
As in all exams, there are questions that do not have definitive answers. Believe me, I know: in more than 20 years as a career firefighter, we were tested continually, whether by the State of Illinois, for promotions, or in daily drills with exams that followed. All of the tests had a similarity to them. Many exams did not have the answer we wanted to mark on a question. Even so, all tests are continually updated and made relevant to the subject matter. No test is perfect, but at least someone is taking the time to improve test questions and their objectivity.

One thing that comes to mind is, "Is the test taker reading too much into the question?" When I took the State of Illinois course to challenge the state test, the instructor repeatedly told us not to read into the question but to take it at face value. Yes, some of the answers were not totally relevant to the question, but the instructor advised us to use the answer that most closely resembled the true answer. During our exams in class, one could see test scores improve as time went on.

So let's not shoot the guy or gal who takes the time to improve the process, but aid him or her in improving those processes and enhancing our profession.


Well said, Kurt!

I am an ASHI member, a NAHI-CRI, and a NACHI member, and while my state's licensing is voluntary, I'm applying for that too. Why not? Credentials can never hurt you.

Any degree of learning is helpful and redundancy reinforces what you know.

I just try to avoid the politics and mud slinging like the plague. No good can come of it.


Mike,

A credential is worth having only if it has merit, based on and/or indicating a level of expertise or professionalism. Any other use of an acronym borders on fraud. The use of some acronyms fuels and funds forces in direct opposition to the valued qualities listed above.


I understand what you are suggesting, Chad, and I hope and trust that in time there will be no further need for discussion regarding the subject.

As a publisher and vendor to the home inspection industry for years, offering educational, reporting, and marketing materials and services, I hope you can appreciate how "in the middle" of the crossfire I am and how much I long for peace. For the last three years I have attended all of the national association conventions.

As you can well imagine, I am a disappointed, dismayed and weary observer of this ongoing skirmish...

This is why I rally behind the last paragraph of Kurt's most recent post.

I love, appreciate and very much care about this industry. My books are testimony to that fact, especially The Zen of Home Inspection.

Best wishes to all!

  • 1 year later...

About two months ago, I first began to contemplate entering the HI world. On the second day I found the NACHI exam and took it. It took me the entire 120 minutes, and I answered all but maybe 2 questions. A score of 80 is needed to pass, and I got a 73.

I'm not an expert in anything, but I know some stuff about most aspects of homes and their systems. I feel OK about a 73 cold turkey. Since then I have been hip deep in training materials, and the mound is growing.

As my head swells with more knowledge and data, I will eventually return and take the test again. I expect to pass it the second time around. I needed the most help in plumbing and structure. My strong point was electrical.


The test is easy enough, just like any other test.

The key is in the continuing education.

The more you take, the more you learn.

Make sure you join a local chapter, as this is where you will find the real help.

No bashing here, just check out the locals. They are important. One may be stronger than another depending on the area.


The most valuable education is going on inspections with someone who knows what he or she is doing. If just two months ago you weren't ready to pass a written exam, you've got a while to go. Keep your eyes open, and have fun.


Gerry

Please excuse my late entry to this thread; I just found it, having only recently joined this group.

As a long-time member of the entry-level and master exam committee of the California Real Estate Inspection Association (1999) and an associate professor in the College of San Mateo's Building Inspection Technology division since 1994, I'm quite familiar with quizzes and exams.

Having never seen the NACHI exam, I'm curious to know how it was put together. Was there a committee involved? Was a role delineation study performed, and are the exam questions and selected answers psychometrically validated? Composing a really excellent exam for any industry or profession is very difficult at best and takes an enormous amount of committee work. I know this only through the experience I've gained through both CREIA and my teaching career. I'm sure Jim Katen will back me up on this, as will anybody who has served on the NHIE committee. It truly is ass-numbing work.

BTW, your pass rate for first-time takers is in the general ballpark.


Just as with state exams, it is a moot point; the real judge is what you do out in the field.

There may be guys out there with a ton of knowledge and great people skills who freeze when taking a test.

Eventually, the business itself is the real score.

Having never seen the NACHI exam, I'm curious to know how it was put together. Was there a committee involved? Was a role delineation study performed, and are the exam questions and selected answers psychometrically validated?

Geez Jerry, I'm glad you came over here...you crack me up. Lord, I literally slapped my knee.

Originally posted by JerryM

. . . Composing a really excellent exam for any industry or profession is very difficult at best and takes an enormous amount of committee work. I know this only through the experience I’ve gained through both CREIA and my teaching career. I’m sure Jim Katen will back me up on this and anybody who has served on the NHIE committee. It truly is ass-numbing work. . . .

I'll enthusiastically back that up. Developing a psychometrically valid exam is very difficult, very time-consuming, and requires collaboration among a large group of people. It also never ends; the test must change constantly. It's a huge job to do correctly.

- Jim Katen, Oregon


Yes. I'll join the backup on that also.

I sat in as a volunteer for a number of days as a guinea-pig evaluator for the NHIE; developing psychometrically validated exams is ass-numbing work.

I'm not aware of any similar effort for the NACHI test. IOW, it's not psychometrically valid and is therefore a silly and essentially worthless enterprise.

NACHI's failure to recognize what an entire world of education recognizes as an appropriate test development protocol is another one of those things that, um.... I'd better stop now.....


I thought I knew a thing or two about writing exams, but this deal below has me somewhat stumped. Perhaps JK or some other learned member on this BB could interpret? [:-crazy]

Thanks in advance.

To earn the title of Certified Inspector, XXXXXXXXX has passed a test that has been developed consistent with the AERA, APA, NCME Standards for Educational and Psychological Testing to include a documented content framework and test blueprint which has been validated via a role delineation study, documented field testing, and statistical analysis documenting among other things "mean p-values," …

Originally posted by ozofprev

Originally posted by chicago

The state exam was, to my understanding, developed by ASHI, and I had no problem with the test.

When I joined NACHI, the questions were tougher. Enough said. Let's not go there.

OMG!

Nope, the Illinois home inspector exam is a hybrid of the NHIE. The state has added a state-specific module to the NHIE. ASHI had nothing to do with the exam in Illinois, other than that some folks who participated in the exam-writing sessions for EBPHI might have belonged to ASHI.

The NHIE is not a difficult exam if you know the material. It is not a tricky exam; it is very straightforward.

The big difference between the NHIE and other exams is this:

At this time, 19 state regulatory bodies have adopted the NHIE. Only the American Society of Home Inspectors and the American Institute of Inspectors require that members pass the NHIE. Other organizations may use member testing of their own, but those tests are not focused on public protection.

EBPHI is an independent not-for-profit organization – not a home inspector membership association such as NAHI, ASHI and NACHI. EBPHI has neither a political nor a market protection agenda, and it does not depend on membership dues revenue. It is free to focus wholly on consumer protection in home inspector competency assessment.

This is the difference between the NHIE and the other guys.

Originally posted by jhagarty

Kurt,

Please define "psychometrically valid"?

What makes the ASHI Exam psychometrically valid in your opinion?

The only exam that ASHI has is their Standards and Ethics exam. ASHI uses the NHIE for their technical exam.


Well, I hate to do this, but you asked....... Also, what Scott just noted about ASHI's test is correct; I misspoke by calling the test "ASHI's test". It is the NHIE, and ASHI uses it for the technical test.

Psychometric theory involves several distinct areas of study. First, psychometricians have developed a large body of theory used in the development of mental tests and analysis of data collected from these tests. This work can be roughly divided into classical test theory (CTT) and the more recent item response theory (IRT). An approach which is similar to IRT but also quite distinctive, in terms of its origins and features, is represented by the Rasch model for measurement. The development of the Rasch model, and the broader class of models to which it belongs, was explicitly founded on requirements of measurement in the physical sciences (Rasch, 1960).

Second, psychometricians have developed methods for working with large matrices of correlations and covariances. Techniques in this general tradition include factor analysis (finding important underlying dimensions in the data), multidimensional scaling (finding a simple representation for high-dimensional data), and data clustering (finding objects which are like each other). In these multivariate descriptive methods, users try to simplify large amounts of data.

One of the main deficiencies of factor analysis is a lack of cutting points: a usual procedure is to stop factoring when eigenvalues drop below one, because the original sphere shrinks. The lack of cutting points concerns other multivariate methods, too. At bottom, psychometric spaces are Hilbertian but are dealt with as if Cartesian, so the problem is more one of interpretation than of method.
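As a minimal sketch of the "stop factoring when eigenvalues drop below one" rule mentioned above (often called the Kaiser criterion), the following uses only NumPy on an invented data set: six items driven by two latent factors, so the rule should retain two factors. The data, factor structure, and noise level are all hypothetical.

```python
# Sketch of the "eigenvalues above one" stopping rule (Kaiser criterion).
# The simulated data have two latent factors, each driving three items.
import numpy as np

rng = np.random.default_rng(0)
f1, f2 = rng.normal(size=(2, 200))               # two latent factors, 200 examinees
noise = rng.normal(scale=0.7, size=(200, 6))
data = np.column_stack([f1, f1, f1, f2, f2, f2]) + noise

corr = np.corrcoef(data, rowvar=False)           # 6 x 6 item correlation matrix
eigvals = np.linalg.eigvalsh(corr)[::-1]         # eigenvalues, largest first
n_factors = int(np.sum(eigvals > 1.0))           # retain factors with eigenvalue > 1
print(n_factors)                                 # 2
```

With this structure, two eigenvalues sit well above 1 (one per factor) and the rest well below, so the criterion recovers the two underlying dimensions.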

More recently, structural equation modeling and path analysis represent more sophisticated approaches to solving this problem of large covariance matrices. These methods allow statistically sophisticated models to be fitted to data and tested to determine if they are adequate fits.

Key concepts

The key traditional concepts in classical test theory are reliability and validity. A reliable measure is measuring something consistently, while a valid measure is measuring what it is supposed to measure. A reliable measure may be consistent without necessarily being valid, e.g., a measurement instrument like a broken ruler may always under-measure a quantity by the same amount each time (consistently), but the resulting quantity is still wrong, that is, invalid. For another analogy, a reliable rifle will have a tight cluster of bullets in the target, while a valid one will center its cluster around the center of the target, whether or not the cluster is a tight one.

Both reliability and validity may be assessed mathematically. Internal consistency may be assessed by correlating performance on two halves of a test (split-half reliability); the value of the Pearson product-moment correlation coefficient is adjusted with the Spearman-Brown prediction formula to correspond to the correlation between two full-length tests. Other approaches include the intra-class correlation (the ratio of variance of measurements of a given target to the variance of all targets).
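The split-half procedure and Spearman-Brown correction described above can be sketched in a few lines of NumPy. The simulated examinees, item difficulties, and test length here are all invented for illustration.

```python
# Sketch of split-half reliability with the Spearman-Brown correction,
# using simulated (hypothetical) 0/1 item responses.
import numpy as np

rng = np.random.default_rng(1)
ability = rng.normal(size=300)                 # latent trait, one per examinee
difficulty = rng.normal(size=20)               # one difficulty per item
# Probability of a correct answer rises with ability relative to difficulty
p = 1.0 / (1.0 + np.exp(-(ability[:, None] - difficulty[None, :])))
items = (rng.random((300, 20)) < p).astype(float)

odd = items[:, 0::2].sum(axis=1)               # score on odd-numbered items
even = items[:, 1::2].sum(axis=1)              # score on even-numbered items
r_half = np.corrcoef(odd, even)[0, 1]          # correlation between the two halves

# Spearman-Brown: predicted reliability of the full-length test
r_full = 2 * r_half / (1 + r_half)
print(round(r_full, 2))
```

The corrected value `r_full` is always higher than the raw half-test correlation, since each half is only half as long as the test whose reliability we want.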

A commonly used measure is Cronbach's α, which is equivalent to the mean of all possible split-half coefficients. Stability over repeated measures is assessed with the Pearson coefficient, as is the equivalence of different versions of the same measure (different forms of an intelligence test, for example). Other measures are also used.
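Cronbach's α can be computed directly from its usual definition: α = k/(k−1) · (1 − Σ item variances / variance of total scores). The sketch below uses an invented 4-examinee, 3-item score matrix purely for illustration.

```python
# Sketch of Cronbach's alpha from its standard definition.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Alpha for an examinees-by-items score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical scores: 4 examinees x 3 items (1 = correct, 0 = incorrect)
scores = np.array([[1, 1, 1],
                   [0, 0, 0],
                   [1, 0, 1],
                   [1, 1, 0]])
print(round(cronbach_alpha(scores), 2))   # 0.63
```

Real test programs compute this over hundreds of examinees and items; the tiny matrix here just makes the arithmetic easy to check by hand.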

Validity may be assessed by correlating measures with a criterion measure known to be valid. When the criterion measure is collected at the same time as the measure being validated the goal is to establish concurrent validity; when the criterion is collected later the goal is to establish predictive validity. A measure has construct validity if it is related to other variables as required by theory. Content validity is simply a demonstration that the items of a test are drawn from the domain being measured. In a personnel selection example, test content is based on a defined statement or set of statements of knowledge, skill, ability, or other characteristics obtained from a job analysis.

Predictive or concurrent validity cannot exceed the square root of the correlation between two versions of the same measure.

Item response theory models the relationship between latent traits and responses to test items. Among other advantages, IRT provides a basis for obtaining an estimate of the location of a test-taker on a given latent trait as well as the standard error of measurement of that location. For example, a university student's knowledge of history can be deduced from his or her score on a university test and then be compared reliably with a high school student's knowledge deduced from a less difficult test. Scores derived by classical test theory do not have this characteristic, and assessment of actual ability (rather than ability relative to other test-takers) must be assessed by comparing scores to those of a norm group randomly selected from the population. In fact, all measures derived from classical test theory are dependent on the sample tested, while, in principle, those derived from item response theory are not.
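The shape of an item response function can be illustrated with the common two-parameter logistic (2PL) model; this is a generic textbook model, not any particular exam's, and the parameter values below are invented.

```python
# Sketch of a two-parameter logistic (2PL) item response function:
# the probability of answering an item correctly, given latent trait
# theta, item discrimination a, and item difficulty b.
import math

def p_correct(theta: float, a: float, b: float) -> float:
    """2PL model: P(correct) = 1 / (1 + exp(-a * (theta - b)))."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

# An examinee whose ability equals the item's difficulty answers
# correctly half the time, regardless of discrimination:
print(p_correct(theta=0.0, a=1.2, b=0.0))             # 0.5
# The same item, for a higher-ability examinee:
print(round(p_correct(theta=1.0, a=1.2, b=0.0), 3))   # 0.769
```

The discrimination parameter `a` controls how sharply the probability rises as ability passes the item's difficulty, which is what lets IRT place test-takers and items on a common scale.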

Standards of quality

The considerations of validity and reliability typically are viewed as essential elements for determining the quality of any test. However, professional and practitioner associations frequently have placed these concerns within broader contexts when developing standards and making overall judgments about the quality of any test as a whole within a given context.

Testing standards

In the field of psychometrics, the Standards for Educational and Psychological Testing [1] place standards about validity and reliability, along with errors of measurement and related considerations under the general topic of test construction, evaluation and documentation. The second major topic covers standards related to fairness in testing, including fairness in testing and test use, the rights and responsibilities of test takers, testing individuals of diverse linguistic backgrounds, and testing individuals with disabilities. The third and final major topic covers standards related to testing applications, including the responsibilities of test users, psychological testing and assessment, educational testing and assessment, testing in employment and credentialing, plus testing in program evaluation and public policy.

Evaluation standards

In the field of evaluation, and in particular educational evaluation, the Joint Committee on Standards for Educational Evaluation [2] has published three sets of standards for evaluations. The Personnel Evaluation Standards [3] was published in 1988, The Program Evaluation Standards (2nd edition) [4] was published in 1994, and The Student Evaluation Standards [5] was published in 2003.

Each publication presents and elaborates a set of standards for use in a variety of educational settings. The standards provide guidelines for designing, implementing, assessing and improving the identified form of evaluation. Each of the standards has been placed in one of four fundamental categories to promote educational evaluations that are proper, useful, feasible, and accurate. In these sets of standards, validity and reliability considerations are covered under the accuracy topic. For example, the student accuracy standards help ensure that student evaluations will provide sound, accurate, and credible information about student learning and performance.

As near as I can tell, the NHIE has hewed to these accepted test-development practices. They have employed, at substantial expense, some of the foremost psychometricians from Duke University to consult on and oversee the process.

There has been an ongoing and extensive review; as Katen previously noted, the process never ends.

Counter-intuitively, a psychometrically valid test doesn't necessarily seem harder or easier than any other test. One defining characteristic is being able to show statistically that a test taker will score approximately the same percentage correct whether the test is 150 questions or 3,000 questions; the key to this is amazingly intense review of the questions and how they are structured.
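One simplified, classical-test-theory way to reason about score stability across test lengths is the Spearman-Brown prophecy formula, which predicts reliability when a test is lengthened by a factor n. This is a generic textbook tool, not a claim about the NHIE's actual procedure, and the reliability figure below is hypothetical.

```python
# Sketch of the Spearman-Brown prophecy formula: predicted reliability
# of a test n times as long as one with known reliability r.
def lengthened_reliability(r: float, n: float) -> float:
    """r_n = n * r / (1 + (n - 1) * r)."""
    return n * r / (1 + (n - 1) * r)

r_150 = 0.85                                     # hypothetical reliability at 150 items
r_3000 = lengthened_reliability(r_150, 3000 / 150)
print(round(r_3000, 3))                          # 0.991
```

In this toy calculation, a test that is already reasonably reliable at 150 questions gains little from being twenty times longer, which is consistent with the point above that length alone is not what makes scores stable.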

I hope I answered your question......


My opinion has nothing to do w/it. I have no opinion.

There is only the fact that EBPHI has taken the time, effort, and expense to develop an exam that is psychometrically valid, and it is called the NHIE. It is psychometrically valid because it satisfies the criteria that you seem anxious to dismiss because I cut and pasted them.

Of course it's cut and pasted, but you asked, so I answered. Whadda you think I'm gonna do; sit there and try to explain an exceedingly complex topic by myself to someone who apparently doesn't have the first bit of knowledge about what we're talking about?

There are very few stupid questions, but yours is a good example of one. And, try to get it right; it's not the ASHI exam.

One of the truly brilliant & purposeful accomplishments of ASHI was recognizing the need for a comprehensive exam to establish a minimum level of competency in this profession.

Recognizing that having said test be the product of a single professional organization was counterproductive to the advancement of the profession, the directors at ASHI correctly split off test development to an entirely separate and independent organization. That is the EBPHI, which stands for the Examination Board of Professional Home Inspectors. If you want to learn more, go here:

http://www.homeinspectionexam.org/

What Scott so gently keeps stressing is correct; there is no ASHI test. It is the NHIE. Period.

Originally posted by jhagarty

Kurt,

Anyone can cut and paste.

What makes the ASHI Exam psychometrically valid in your opinion?

Joe, are you talking about the NHIE? This is the exam that ASHI requires. You have taken the NHIE, and you are a very intelligent and experienced home inspector. I think you already know the answers to your questions. The links that I posted go into detail about psychometric exams.

http://www.castleworldwide.com is one of the leading psychometric companies and has been involved in validating the NHIE.

