
Welcome to the SoSci Survey online support.

Simply ask a question to quickly get answers from other professionals, and directly from SoSci Survey.


0 votes

Hi,

Is there a way to disable individual questionnaires within a survey?

I have insufficient numbers of participants for some questionnaires, although all should have been equal... To make up for this I was thinking of disabling the questionnaires for which I already have enough participants. This will only leave the ones for which more participants are needed as active. The other way I see is with the ballots but I would like to avoid playing with them at this stage...

Thank you!

in SoSci Survey (English) by s173394 (195 points)
> I have insufficient numbers of participants for some questionnaires

How do you distribute respondents to the questionnaires?

> To make up for this I was thinking of disabling the questionnaires

Bad idea, you might heavily confound your data, IF you are doing an experiment: https://www.soscisurvey.de/help/doku.php/en:create:questions:random#readjust_drawings
How do you distribute respondents to the questionnaires?
- Through the Prolific platform: all respondents see the general link to the base questionnaire and click on it. What probably happens is that while some are still working on the survey, their drawn questionnaire already counts as assigned, so SoSci does not assign the same questionnaire again. If those respondents later time out on Prolific, Prolific opens a slot for a new participant (because I have specified on Prolific how many participants I need, and if someone takes too long to complete, a new person is allowed to enter). The new participant is then assigned to a different questionnaire, because the other one counts as full. Or at least that's how I can explain it...

In the meantime I have decided that for now I will leave things as they are and will not try to make up for the unequal numbers. It was not crucial to the study, it was only needed to test consensus. But it can be done with the current data. It would just be difficult to explain and I am trying to understand it and avoid it in the future.

Thanks a lot!
> All respondents see the general link to the base questionnaire and click on it

My question was more: What PHP code are you using in SoSci Survey to distribute the respondents to the different questionnaires? And if you are using a random generator, what have you set for the distribution?

> It would just be difficult to explain and I am trying to understand it and avoid it in the future.

If you provide the information about PHP and random generator, I'll try and explain :)
Ah, I see, sorry :)

I followed the steps just as they are outlined in the manual for random generator "equally distributed": https://www.soscisurvey.de/help/doku.php/en:create:random_questionnaire

Including the PHP code given on that page:

$qnr = value('RF01', 'label');  // label of the questionnaire drawn by RF01
goToQuestionnaire($qnr);        // send the respondent to that questionnaire

1 Answer

+1 vote

Thank you for posting the PHP code.

The first thing you should take a look at is the random generator RF01. You can see the counter for every item on the right side.

If these numbers exceed the number of completed/usable questionnaires, then you may have an issue with drop-outs. If the drop-out already happens on the first page, you may consider putting the welcome page (where most people close the browser tab, compared to later pages) in the main questionnaire, and the PHP code on page two.

Generally, this solution will not (!) check whether respondents actually complete the questionnaire or close the browser window immediately (or anything in between). Therefore, if you see a severe mis-distribution, it might be an effect of your stimulus. Creating a cross-table between QUESTNNR, FINISHED, and LASTPAGE is usually a good start.
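As a sketch of that cross-table, assuming the data have been exported and read into pandas with SoSci's standard column names QUESTNNR, FINISHED, and LASTPAGE (the sample values below are made up for illustration; in practice you would load your own export, e.g. via pd.read_csv):

```python
import pandas as pd

# Made-up sample standing in for a real SoSci export; the column
# names QUESTNNR, FINISHED, LASTPAGE follow SoSci's export scheme.
df = pd.DataFrame({
    "QUESTNNR": ["A", "A", "B", "B", "B"],
    "FINISHED": [1, 0, 1, 1, 0],
    "LASTPAGE": [10, 1, 10, 10, 4],
})

# Completed vs. dropped-out cases per questionnaire
table = pd.crosstab(df["QUESTNNR"], df["FINISHED"])
print(table)

# For the drop-outs only: which page was the last one reached?
dropouts = df[df["FINISHED"] == 0]
drop_pages = pd.crosstab(dropouts["QUESTNNR"], dropouts["LASTPAGE"])
print(drop_pages)
```

If one questionnaire shows many drop-outs clustered on the same page, that page (or the stimulus on it) is a likely cause of the mis-distribution.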

by SoSci Survey (200k points)
Thanks so much!

I took a look at the random generator counter and the columns QUESTNNR, FINISHED, and LASTPAGE in the data. Yes, there do seem to be a couple of drop-outs, but only sometimes right after the start page. Some respondents seem to have left mid-study, maybe because the task is a bit repetitive. But it's better to reduce the drop-out rate as much as possible, so next time I will put the random generator after the first page - thank you for suggesting this!

Otherwise the mis-distribution is not so big - only in a few cases. I think it should be possible to analyze the data as they are. I will not try to correct for it at this stage - it might cause more trouble.

Thanks again for all your help!
...