Why I Think the New “Pop-Up” Study is Flawed
Ars Technica has details of a new study from the excellent (yes, I'm biased) Psychology Department of North Carolina State University. Basically, the researchers created a series of "pop-up" windows designed to look like system messages, but that could just as easily have been malware.
Of the 42 students, 26 clicked the OK button for the “real” dialog. But 25 clicked the same button for two of the fakes, and 23 hit OK on the third (the one with the status bar showing). Only nine of them closed the window—two fewer than had closed the real dialog. In all cases, a few of the users simply minimized the window or dragged it out of the way, presumably leaving the machine’s next user at risk.
Now, I'm not a psychology major, but I did study it for two years back in Ole Blighty, so I'd like to think I have some insight into experiments like this.
OK, here's why I think the data is flawed. It appears the experiment was carried out on the college's computers, likely in some research lab. With that in mind, why would anyone (especially a college kid) care about what might happen to a computer they don't own? In that situation, I'd assume the college had virus protection installed, or I'd simply not take ownership of my actions.
If the above assumption is correct, I'd consider the data useless until the study is conducted again, this time using the students' own computers, in their own homes, and with perhaps more than 42 participants. I suspect ("hypothesize" in psych speak) that they'd be a lot more cautious with their clicks.
Let me know your observations (now that I've biased them with my analysis).