The Pollfish Survey Process for Reviewing and Approving Surveys

The survey process at Pollfish involves a variety of factors and actions that ensure quality results from your deployed surveys. 

As soon as a researcher hits “checkout” on the Pollfish online survey platform, their survey undergoes a process of survey review and approval before it launches.

This way, the researcher can rest assured that their survey is error-free and the questions make sense, should they choose to add survey logic or advanced skip logic. The last thing you would want is to have poorly structured surveys that confuse your target audience.

After a survey gets approved and is launched, it still undergoes a rigorous post-launch survey process using artificial intelligence, as the survey is being completed and responses are gathered. 

This weeds out low-quality data and prevents survey fraud on the Pollfish network.

This article explains the survey process of reviewing and approving surveys before they get deployed on our massive publishers’ network. 

How the Review and Approval Survey Process Works

After you reach the third (and final) section of the Pollfish online survey platform — the checkout action — you can check out your survey by hitting the “Submit for approval” button.

This begins the review and approval process, in which members of the Pollfish Review team examine your survey before launching it. Not all surveys will be launched upon review. Some surveys may require you to go back and fix an issue that the Review team has flagged.

If the team has any concerns about any aspect of a survey, they will communicate them in the form of comments attached to a question or to the survey's structure. You'll see these comments in your survey after the Review team adds them.

Comments can be either mandatory or optional. When researchers receive both mandatory and optional comments, they must address both; otherwise, their survey will not launch. With mandatory comments, you must edit your question according to the suggestion(s) in the comment.

With optional comments, although you do not have to follow the suggestions, you still need to acknowledge them so that your survey can go live. To do so, click the accompanying “OK, got it” button.

Once you’ve responded to the comments, by either applying the suggested changes or clicking the acknowledgment button, the Review team will be notified and your survey will be under review once more.

From there on, the team can approve the survey and launch it immediately across our wide network of publishing partners. These include websites, apps and mobile sites. 

Before the Review team approves a survey, researchers can withdraw it. 

This process does not apply to the Distribution Link feature, in which researchers send surveys to specific individuals rather than to a large network of visitors across different digital properties.

Elite accounts can contact their account manager if they need extra tips or advice for the review process. 

What Happens in the Review and Approval Survey Process 

When the survey review process starts, the survey under review is locked, so researchers cannot make any changes to it.

If the researcher wants to make any last-minute changes, they must contact the team to apply them. This can be done via the support chat or a phone call. 

During the review and approval survey process, the Review team searches for errors that could prevent the client's survey from functioning properly. The team's goal is to intervene in the survey design as little as possible, leaving the original design up to the user of the platform.

The Review team suggests best practices based on our methodology whenever they are applicable, and checks the survey for compliance with our policies.

The team flags aspects of the survey by making mandatory comments (in which questions or aspects of the survey must be edited or changed). Optional comments offer suggestions to optimize the design of the survey to get the best possible outcome for your market research goals.

The review process can be completed in as little as a few minutes or take up to 2 hours, depending on how many surveys are in the pipeline. Elite surveys have higher priority in the review queue.

Mandatory and Optional Comments

Not all surveys will receive comments, as the review team adds them to resolve errors. If there are no errors, the survey is launched immediately after the review.

If a review includes mandatory comments, whether on their own or alongside optional ones, the survey returns to a draft state and appears in the Draft tab of the dashboard. Researchers must apply the changes requested in the comments, then edit and resubmit the survey for another review.

A review with only optional comments grants the survey an approved state; even so, researchers must acknowledge the comments (this also applies when optional comments arrive alongside mandatory ones). After hitting “OK, got it” on the optional comments, researchers can launch their survey.

The comments can be seen within the questionnaire. 

If the review consists of optional comments only, the researcher can proceed with one of two actions directly from the review email they receive.

The first option is to launch the survey directly, with no further review and without opening the questionnaire again.

To do so, click on the three dots next to the survey in the “Approved” tab of the dashboard, then select “Launch now.”

The second option is to edit the questionnaire, apply the optional suggestions, and go through the review process again. To do so, click on the three dots next to the survey in the “Approved” tab of the dashboard and select “Edit.”

If researchers don’t make the changes requested in mandatory comments, they’ll repeat the survey process, as the survey will remain blocked until all mandatory comments are addressed with their respective changes.

For example:

If there are 10 mandatory comments, but the researcher only made changes to 2 of them, the survey will still be blocked and they’ll continue receiving emails to address the comments. 

If the researcher resubmits the survey without making any changes, the process will repeat. They also have the option to contact the support team.

In short, optional comments get an approved state. Researchers can edit if they want and resubmit the survey, or launch it without resubmitting.
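
As a quick recap of these rules, here is a minimal sketch in Python of how a review's comments map to the survey states described above. The function and state names are purely illustrative and are not part of the Pollfish platform or API.

def review_outcome(mandatory_comments: int, optional_comments: int) -> str:
    """Map the comments from one review to the resulting survey state."""
    if mandatory_comments > 0:
        # Any mandatory comment, alone or mixed with optional ones, sends the
        # survey back to the Draft tab for edits and resubmission.
        return "draft"
    if optional_comments > 0:
        # An optional-only review approves the survey; the researcher can
        # acknowledge the comments and launch, or edit and resubmit.
        return "approved"
    # No comments at all: the survey launches right after the review.
    return "launched"

print(review_outcome(3, 1))  # draft
print(review_outcome(0, 2))  # approved
print(review_outcome(0, 0))  # launched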

Pollfish can be flexible when it comes to specific mandatory comments. For example:

In a question that asks about respondents' sentiments, contradictory statements in a multiple-selection question will typically be flagged. The team will add a comment asking for this kind of question to be changed to a single-selection question.

However, if the purpose is to collect all of the respondents' sentiments, despite their complexity, researchers can ask for the mandatory status to be removed, either by replying to the email (more on emails below) or by chatting with or calling a support team member.

They can also chat from within the survey. A Review team member will then evaluate the request and, if it complies with our policies, disregard the comment.

How Surveys Are Rejected in the Process

The Review team can reject a survey for any reason in the system. If a survey is rejected (due to one or multiple comments that need to be either acknowledged or fixed), it will be sent back to the researcher.

Aside from technical issues flagged in the comments, which can relate to improper survey logic or to wrongly formed questions of any type, surveys can also be rejected for the following reasons:

  1. Violating the terms of the survey
  2. Design issues

Unique Respondents Across Surveys

If researchers run two or more surveys and need them to reach different respondents, they must reach out via chat and ask the support team to group the surveys on the backend so that each one gets unique respondents. This must be done before submitting the survey at checkout.

For example, they may target the same audience (highly educated people in the US) but need some individuals in that audience to see one campaign survey and others to see a different survey.

Scheduled Surveys

When the team approves a scheduled survey, the survey gets an approved status. This means it stays in the Approved tab until the launch date and time.

Researchers must acknowledge comments even if the survey is scheduled. If they don’t acknowledge the comments, the survey won’t launch.

To make changes or acknowledge the comments, you must edit the survey and then check out. Doing so applies the required or optional changes. By clicking “Submit,” the survey process for review and approval starts again.

Emails in the Survey Process

Emails play a major role in the survey process of review and approval, as they are the main communication channel between the Review team and the users of the Pollfish platform.

Researchers will receive a series of emails. The first email confirms that the survey has been submitted for approval. Often, the second email relays that the survey requires action; this type of email will include notice of either mandatory or optional comments.

The optional email is titled: “Our survey review team has recommendations only.”

The mandatory email is titled: “ACTION REQUIRED: Please edit your survey [name of the survey] for it to start.” The body of both kinds of emails will inform you that your survey has recommendations; however, the mandatory one is marked as such in red in the body.

If you’ve made all the necessary changes to the mandatory comments and acknowledged all the optional ones, you will then resubmit the survey. 

Issues in the Survey Process that Prompt Comments

The Review team encounters different issues during the survey review and approval process. As mentioned above, these issues hamper the launch of the survey and are up to the researcher to fix.

The previous section on how surveys are rejected gives a few key examples of issues that cause the team to block the survey and place comments for the researcher to address.

The following lists more key issues that prompt comments and temporary blocks to surveys:

  1. Personal Information
    1. Per the Pollfish terms and conditions, we do not allow questions collecting personally identifiable information and/or contact information (such as name, address, place of work, phone number, email address, etc.) to protect the respondents’ privacy and anonymity. 
    2. We also do not allow researchers to share their own personal or contact information through the survey.
  2. Links/URLs in questions/answers
    1. We do not allow researchers to insert links within questions redirecting respondents outside the survey environment; per our methodology, a respondent's participation will be considered abandonment if they click on the link.
  3. Proper language selection
    1. The survey’s language must match the language selected at the upper left corner of the Audience page.
  4. Questions about Age or Gender 
    1. Pollfish shares by default each respondent's age range on the results page and their specific year of birth (and therefore age) in the downloadable Excel file.
    2. We also provide, by default, each respondent's gender in the results.
  5. Age restrictions regarding the survey's content
    1. Based on the content of the survey, you must target adults above 18 or 21 years old to take your survey if there is an age restriction in your targeted market. 
  6. Sensitive questions and content 
    1. Surveys including sensitive content must meet the three following criteria:  
      1. Respondents must be adults above 18 years old. You can review your age targeting in the audience tab. 
      2. The survey must contain one initial screening question stating the study's content, for example: “The following survey contains questions regarding alcoholic behavior. Are you willing to take part in the following survey? Your responses are anonymous.”
        1. Statement 1: Yes, I’m above the age of 18, and I’m giving you my consent.
        2. Statement 2: No, I do not wish to take part.
    2. Then, you must add one description question briefly explaining the content of the survey and why you want to know this information or how you will use it. 
  7. Multiple Open-Ended questions combined in one question 
    1. You must not combine multiple Open-Ended questions into one since it causes confusion and frustration to the respondents.
  8. Demographic questions
    1. We provide, by default, the demographic data of the respondents, so you don’t need to include questions to collect them. 
    2. If you choose to include demographic questions, we will not provide the relevant demographic data, to avoid any repetition, which might affect our respondents' experience. 
  9. A question with a topic that does not match its targeting
    1. When your question is aimed towards a specific demographic, you need to add Skip Logic to the previous question referencing the audience filters; this way, only the relevant audience will respond to this question, whereas the rest will skip it.
    2. Read more on how to implement advanced survey logic.
  10. Promotional content
    1. We do not allow any promotional content (e.g., website links, subscriptions, discounts, etc.) within the questionnaire. Pollfish is a market research platform and not a traffic provider.

Getting the Best Out of Surveys

Researchers should use a mobile-first online survey platform to get quality results at speed, as mobile dominates the digital space. No one wants to take surveys in an environment that isn’t suited to mobile devices.

Your online survey platform should also offer artificial intelligence and machine learning to remove and disqualify low-quality data, and it should offer a broad range of survey and question types.

The survey platform should offer advanced skip logic to route respondents to relevant follow-up questions based on their previous answers. 

It should also allow you to survey specific individuals, such as your own employees. As such, you’ll need a platform with a reach of millions of users, along with one that offers the Distribution Link feature. This feature allows you to send your survey to specific respondents, instead of only deploying it across a vast network.

With Pollfish, you’ll have all of these capabilities, along with having your surveys reviewed and examined by a wealth of experts.


How to Conduct a Survey in 5 Easy Steps

If you need to conduct a survey for market research or any other kind of research, there are certain steps you should adhere to. Complying with these steps will ensure you form effective survey studies.  

Surveys are powerful tools for extracting virtually any kind of information. Whether you need to examine the behaviors of a particular population, perform market segmentation, find correlations between variables or discover attitudes towards your own brand, survey research makes it possible. 

This article instructs marketers, researchers and business owners on how to conduct a survey that brings value to your research endeavors in 5 steps. 

What Constitutes a Useful Survey

A survey must be business-savvy if you need primary research for a business, medically apt for medically-oriented research, socially-inclined for a social science study and so on. As such, each type of survey is unique to its study, macro application and specific type of survey research.

This is because surveys should never be used as lone tools. They work best when connected to a larger campaign. Thus, before you embark on survey research, you should consider what it is you need to study. 

If your research is specific to a vertical, you will need to conduct other research, such as secondary research, before turning to surveys. Only then can you decide which gaps you need to fill.

A useful survey fulfills a certain purpose; for some, it can be finding a correlation between a variable and a phenomenon, for which you would need to conduct descriptive research. Another research effort may require statistically valid information, for which you would need to conduct quantitative studies and a longitudinal survey.

All in all, a valuable survey is a survey generated through methodical means, as survey research involves a process.

How to Conduct a Survey 

Since conducting a survey is contingent on a methodology, researchers ought to understand how a general survey campaign must be conducted. The following explains how to run a survey campaign in five simple steps.

Step 1: Determine the Higher Levels of Your Survey Campaign 

Even if you have a target population you’d like to study, start from the top of your campaign. As aforementioned, you’ll need to attach your survey to a larger campaign, or macro application. 

For example, if you require a survey for market research, decide which subsector your study fits under. It may belong to digital marketing, advertising, branding, PR, or competitive research, all within the overarching campaign of market research. If you would like to study a social phenomenon, find a category under the social sciences into which the survey fits.

Step 2: Filter Your High-Level Campaign Further By Considering General Inquiries

After attaching your survey to a larger campaign, group your campaign further by considering your general themes and inquiries. Identifying them will help you filter your specific survey beyond the macro and higher-up levels.

You’ll need to find the correct type of survey research and survey method to dictate your survey study. Regarding the former, you’ll need to determine if your survey requires descriptive, exploratory or causal research.

As for the latter, which has a greater focus on the length of a survey and its respondents, you can group your survey into retrospective, cross-sectional, longitudinal and prospective studies.

Since the latter methods involve frequency of deployment, you can better understand whether you’ll need to conduct one or several surveys for your study. Since surveys are best kept short for more responses and quicker data collection, your study will likely need more than one survey. 

This is especially true if you opt for retrospective or longitudinal surveys, which can last for weeks to years.

Additionally, the length of your study will also give you an understanding of how many questions you’ll need and how to thematically divide your survey.

Step 3: Create a Specific Campaign and Generate Questions

Now that you’ve found the type of survey research to undertake and tied it to a method, you’ll need to go back to step one (so to speak). 

From the general inquiries you’ve put together in step two, attach them to a specific campaign under your macro application. For example, say you’re conducting market research and require a study on advertising. You’ll need to attach your survey to an existing campaign under your advertising department or create a new campaign within it. 

In this case, consider your advertising needs and general inquiries. Do they revolve around a new ad campaign? Specifically, what exactly do you need to research? Perhaps you would like to learn how your target market responds to different messaging, imagery and promotions. If so, create a campaign centered on it. 

Once you’ve chosen or formed a specific campaign within a higher-level campaign (in this case, advertising), take your general inquiries and use them to form specific questions. 

Step 4: Put Together Your Screener and Questionnaire

Step 4 revolves around the survey itself, which requires a dual approach: forming the screener (the screening questions and demographic quotas) and the questions themselves, i.e., the questionnaire. 

You may need to conduct secondary research to understand the kind of people who constitute your target market or target population. Or you can conduct market segmentation to learn about the specific segments of your target market, as mentioned in the introduction.

Once you know exactly who to target the survey to, input them into the screener section via demographics, geolocation and other audience criteria. Add screening questions for more precise targeting. With these questions, you can permit respondents with specific answers to take part in your survey, as well as prohibit certain ones based on their answers.

For the question portion, add the questions you’ve come up with in Step 3. As you input them into the online survey tool of your choice, consider how you can make them better. Some questions will warrant an “Other” option, while others will require a follow-up question (but only for certain answers).

Example: “Which of the following brands have you heard of?” (A multiple-choice, multiple selection question)

If the respondent chooses one (or more) of the answers you provide, they should be routed to a follow-up question about the brand(s) they selected. This can be done via skip logic.

But, if they select “none of the above,” they shouldn’t be routed to a question about any of the brands, but rather to another question. Or if your survey is dependent on knowledge of the brands you provided, you can consider ending the survey for this respondent.
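
To illustrate the routing above, here is a minimal sketch in Python of how skip logic for this brand-awareness question might be modeled. The question IDs and the routing table are hypothetical and do not reflect how any particular survey platform implements skip logic internally.

def next_question(selected_brands: list[str]) -> str:
    """Route a respondent after the multiple-selection brand question."""
    # Hypothetical follow-up question IDs for each brand in the answer list.
    follow_ups = {
        "Brand A": "Q2_brand_a_follow_up",
        "Brand B": "Q2_brand_b_follow_up",
    }
    if not selected_brands or selected_brands == ["None of the above"]:
        # No brand awareness: skip the brand follow-ups entirely (or end the
        # survey here if the study depends on knowing the brands).
        return "Q3_next_topic"
    # Otherwise, route to the follow-up for the first brand selected.
    return follow_ups[selected_brands[0]]

print(next_question(["Brand B"]))            # Q2_brand_b_follow_up
print(next_question(["None of the above"]))  # Q3_next_topic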

There are, of course, plenty of other considerations to take into account when creating insightful market research questions. For example, some questions may require a scale, such as the Net Promoter Score or the Likert scale.

The former is appropriate to use in customer satisfaction and customer loyalty surveys, while the latter can be used in virtually any survey that requires answers with variations in, say, agreement with a statement. In such a case, the answers would include a scale with “strongly disagree” and “strongly agree” at opposite ends of the scale.
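
For reference, the Net Promoter Score itself is calculated from a 0–10 “How likely are you to recommend us?” question as the percentage of promoters (scores of 9–10) minus the percentage of detractors (scores of 0–6). Here is a minimal sketch, using made-up ratings:

def net_promoter_score(ratings: list[int]) -> float:
    """Compute NPS from 0-10 likelihood-to-recommend ratings."""
    promoters = sum(1 for r in ratings if r >= 9)   # scores of 9-10
    detractors = sum(1 for r in ratings if r <= 6)  # scores of 0-6
    return 100 * (promoters - detractors) / len(ratings)

sample_ratings = [10, 9, 8, 7, 6, 10, 3, 9, 8, 5]  # made-up responses
print(net_promoter_score(sample_ratings))  # 10.0 (4 promoters, 3 detractors out of 10)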

Step 5: Launch, Analyze & Take Action Post-Survey

When you’ve decided on how you’re going to organize your survey based on your questions and settled on all the necessary questions, you’re almost at the launching point.

First, review your survey. Make sure there are no technical glitches (e.g., if you’ve added media files to questions) or typos.

Before launching the survey, consider adding survey incentives. These will help ensure you receive your results sooner. They will also position your brand in a favorable light if your survey mentions it explicitly.

After launching the survey and receiving the preset number of completed surveys, it’s time to analyze the survey data. There are a number of ways to go about this, from deciding on a data presentation mode (e.g., cross-tabs, charts, graphs, spreadsheets), to searching for statistical significance, to finding patterns and correlations.
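
As one possible illustration of this step, the sketch below builds a cross-tab and runs a chi-square test of independence using pandas and SciPy. The column names and responses are made-up sample data, not output from any particular platform.

import pandas as pd
from scipy.stats import chi2_contingency

# Made-up responses: brand awareness by age group.
responses = pd.DataFrame({
    "age_group": ["18-24", "25-34", "18-24", "35-44", "25-34", "35-44"],
    "heard_of_brand": ["Yes", "No", "Yes", "Yes", "No", "No"],
})

# Cross-tabulate awareness by age group.
crosstab = pd.crosstab(responses["age_group"], responses["heard_of_brand"])
print(crosstab)

# Chi-square test: is brand awareness independent of age group?
chi2, p_value, dof, expected = chi2_contingency(crosstab)
print(f"chi2={chi2:.2f}, p={p_value:.3f}")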

With the correct survey data analysis, you’ll be able to put your data to use effectively and make informed decisions. After analyzing your survey results, you should gather the key findings into another document and organize it by the nature of your findings.

Then, you’re in the final stage: post-survey mode. At this juncture, you should consider what to do with the insights you’ve gathered. Perhaps it’s time to take action, or you may need more information, for which you can create another survey.

If you decide to take action, consult with your team on the most logical course of action to take, whether that means price markdowns, different approaches to improving CX, providing customers with additional information, or anything else that serves your business or research needs.

Mastering Survey Research for all Campaigns

To master survey research campaigns, you ought to perform research (usually secondary) on the research sphere itself, which is steadily innovating. Additionally, you need to have one secret weapon in tow: a strong online survey platform. 

Through this platform, you’ll be able to conduct a survey — from creation to the launch phase to the analysis — with ease. A proper survey platform provides all the necessary interfaces, capabilities and quality checks to power your survey research.

Frequently asked questions

What are some useful points for conducting a survey?

A useful survey is never used as a lone tool. It always accompanies a bigger campaign. A useful survey fulfills a specific purpose - it can be finding a link between a variable and a phenomenon, for which you would need to conduct descriptive research. If your research is specific to a topic, you might need to conduct other research, such as secondary research, before turning to surveys.

What makes for a good survey?

Keeping survey questions neutral, using close-ended questions, offering a balanced set of answer choices, and diversifying question formats (such as rating scales and multiple-choice questions) are among the fundamental ways of making your survey interesting and valuable.

What constitutes an effective questionnaire design when conducting a survey?

When creating a questionnaire, the questions should be as specific to the target customers as possible. They should use basic wording and always have the option of 'other' to have more avenues for research. However, to skip all this hassle, use a market research survey platform to provide you with the data you need.

What is your survey launching process?

Once we have all the questions we want to include in the survey, we review them for any technical glitches. We also add survey incentives to encourage people to fill out these surveys and then launch these surveys on our target market's choice of social media platform.

How do you analyze survey data?

We analyze data sets in different ways. For example, sometimes our researchers organize all the collected data in tabs, graphs, and charts to find common patterns and correlations or sometimes to look for statistical significance.