What is the construct you are trying to measure? Some kind of knowledge or ability? How did you decide what is a "correct" answer? Is this a scale that has been used in prior research, or did you put it together yourself?

From the printout, what are the low and high inter-item correlations? What do the "corrected item-total correlations" look like? Is there any pattern of negative correlations in the inter-item correlations which might hint that there was an error in scoring, or that there is more than one construct the respondents might be using to answer?

What the error message is telling you is that in the remaining data after listwise deletion there are more items than cases - or, if there are more cases than items, that one or more items is (close to) perfectly predictable from one or more other items.

Now that you have explained that your data are not attitudes or symptoms (which would be more highly correlated), but items that to many people are rather like random trivia, I suspect that your intercorrelations are not suspicious after all.

What is always true about reliability, but is often irrelevant when it is not simply forgotten, is that any reliability is also a measure of the sample, not just of the instrument. So if your sample includes a wide range of people in terms of how much they know, you would get a higher reliability score than (say) if you intentionally excluded dieticians and chefs from a sample of "ordinary people". Consider your reference source and what it says about its sample; that is the baseline sample for considering the variability of your own. A wider range implies a higher alpha.

One thing that can cause a low alpha is that you miskeyed the data - e.g., there is more than one answer that could be correct, or you chose the wrong one. I suggest that you enter the data exactly as the students answered, as the alphabetic response (or numbers if you want). Then you should either enter the data a second time and do a file comparison, or crosstab the raw responses by the keyed responses and check the keying that way. Then recode the data into 23 new variables.

PS - I read elsewhere that reading from Google Groups was disrupted for a couple of days.
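The crosstab check suggested above can be sketched in a few lines of pandas. The responses and item name here are invented for illustration, not taken from the thread:

```python
import pandas as pd

# Hypothetical data: raw letter answers and the 0/1 keyed score for
# one item. Every raw letter should map to exactly one keyed value;
# a letter appearing under both 0 and 1 signals a miskeyed recode.
raw = pd.Series(["a", "b", "b", "c", "b", "a"], name="raw_q1")
keyed = pd.Series([0, 1, 1, 0, 1, 0], name="keyed_q1")

# Crosstab the raw responses by the keyed responses.
print(pd.crosstab(raw, keyed))
```

Here "b" falls entirely under 1 and "a"/"c" entirely under 0, so the keying for this item is consistent; a row with counts in both columns would flag an error.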
Did you double check your manual recoding? I usually suggest entering the data "as is".
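Letting the software do the recoding, rather than recoding by hand, might look like this minimal pandas sketch. The answer key and responses are made up for illustration:

```python
import pandas as pd

# Hypothetical answer key: item name -> keyed letter.
key = {"q1": "b", "q2": "d", "q3": "a"}

# Raw data entered exactly as the students answered.
raw = pd.DataFrame({
    "q1": ["b", "c", "b"],
    "q2": ["d", "d", "a"],
    "q3": ["a", "b", "a"],
})

# One line of scoring: compare each column to its keyed answer,
# producing the 0/1 variables. No hand recoding to double check.
scored = raw.eq(pd.Series(key)).astype(int)
print(scored)
```

Because the key lives in one place, changing a keyed answer and rescoring is a one-line edit instead of redoing 23 manual recodes.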
0.591 is not necessarily a terrible alpha for 23 binary items, but it depends mightily on what the subject area is.

I take it that you do not know what a determinant is, so here is a short version. When you do a principal component-type of decomposition of a matrix, you get a set of components of varying sizes; the determinant may be taken as the product of those sizes, so it is zero whenever any component is zero. The number of non-zero components is at most the minimum of the number of variables and the number of rows. Thus the determinant is zero when:
 - some item is a constant (either 0 or 1 in all rows, for 0/1 scoring);
 - there are fewer rows than there are variables;
 - there is some linear dependency among rows.

The simplest linear dependency is when two items have exactly the same (or exactly opposite) responses. The second simplest may be when you have included a total along with the items it is computed from. However, there is a linear dependency whenever any combination of rows can be "added" (with multipliers allowed) to create some other row, and this is guaranteed to be possible when there are fewer subjects than items.

>I now understand what a determinant is and how the determinant can be zero.
>However, what does it mean by "there are fewer rows than there are
>variables"?

I did not know what the data actually are, so I wasn't sure that the rows were subjects. Are any of the means really low/high? Are any of the correlations especially high? I'd look at that information before I ever looked at anything fancier. As I mentioned, any item being constant gives a zero determinant, as does any two items being identical in answers.

Any time that I have a set of questionnaire items, I ordinarily perform a factor analysis, just to see what structure is there - and to make sure of stuff like "right variables", "right labels", and so on. There will be factors, and they should have some face validity as factors for this scale. The factor procedure might do something to flag zero determinants, and I know that it would pick out any "duplex", or pair of items that correlate perfectly. If I recall right, you can locate more subtle dependencies by a clever use of regression: take "ID" as the outcome and use all the items as predictors. The regression will drop out some variable(s) as being linearly dependent.

Well, entering the data as zero & one is not wrong. As a rule of thumb, though, it is better to let the software do the recoding.
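The zero-determinant conditions described above can be demonstrated with a small numpy sketch; the sample sizes and data are made up:

```python
import numpy as np

rng = np.random.default_rng(42)

# Case 1: 10 respondents answering 23 binary items. Fewer rows than
# variables guarantees a singular (zero-determinant) covariance matrix.
X = rng.integers(0, 2, size=(10, 23)).astype(float)
C = np.cov(X, rowvar=False)        # 23 x 23 covariance matrix
print(np.linalg.matrix_rank(C))    # at most 9 (= 10 rows - 1)
print(abs(np.linalg.det(C)))       # effectively zero

# Case 2: plenty of rows, but one item duplicates another.
# That linear dependency also drives the determinant to zero.
Y = rng.integers(0, 2, size=(200, 5)).astype(float)
Y[:, 4] = Y[:, 0]                  # item 5 repeats item 1 exactly
print(abs(np.linalg.det(np.cov(Y, rowvar=False))))  # effectively zero
```

An item that is constant in all rows is just a special case of Case 2: its variance is zero, so the covariance matrix again has a zero component and a zero determinant.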