A commonly cited maxim states that “you get what you pay for”, implying a strong correlation between the price paid for something and its quality. In this paper, we examine whether this wisdom applies to crowdsourced subjective QoE experiments, and if so, how. As part of a large-scale user study designed to assess Web QoE, we conducted two crowdsourcing campaigns to collect user ratings and study the influence of certain website design parameters related to typography and color on the overall visual appeal of a site. While the test content was identical across both campaigns, the second campaign paid participants three times the reward of the first. The goal was to analyze the impact of payment on a number of outcomes, including the proportion of reliable users and the obtained MOS values. With respect to QoE modeling, we found that while payment level influenced absolute MOS values, it had no significant impact on the resulting model.