Using data shared by some of the most experienced workers on MTurk, users can gain insight into HITs that pay well and are safe to work on. Our users maintain some of the highest standards on the platform, with most holding over a 99.5% approval rate, meaning HITs that appear here will pay out when completed properly.
Get paid fairly: This requester is using the Fair Work script to bring pay rates up to the minimum wage of $15/hr. Fair Work does this by measuring completion times and then auto-bonusing workers to meet the desired hourly wage if needed.
Please report how much active work time this task took you, rounded to the nearest half minute. If you are far off from the median time report across all workers for this task, indicating a lack of good-faith estimation, you may be removed from Fair Work bonus eligibility. Bonuses are sent out daily. By participating, you acknowledge and consent to the Fair Work IRB.
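The bonus mechanism described above (aggregate worker time reports, then top up pay to the target wage) can be sketched as follows. This is a minimal illustration, not Fair Work's actual implementation; the function name `fair_work_bonus` and the use of the median of self-reports are assumptions for the sketch.

```python
from statistics import median

TARGET_HOURLY = 15.00  # Fair Work's stated minimum wage target, $/hr


def fair_work_bonus(reports_minutes, base_reward):
    """Estimate the per-worker bonus needed to reach the target wage.

    `reports_minutes` is a list of worker time self-reports in minutes;
    the median is taken so a single bad-faith report doesn't skew the
    estimate. Returns the extra payment (in dollars) needed to bring the
    task up to $15/hr, or 0.0 if the base reward already meets that rate.
    """
    est_hours = median(reports_minutes) / 60.0
    target_pay = TARGET_HOURLY * est_hours
    return max(0.0, round(target_pay - base_reward, 2))
```

For example, with self-reports of 4, 5, and 6 minutes (median 5 minutes) and a $0.50 base reward, the target pay is $1.25 and the bonus would be $0.75.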
Fair Work is a research project run by the Stanford HCI Group.
Description: Fair Work is a research system focused on fair pay for online workers. You can participate by completing this HIT and filling out the time self-report above. Worker responses will be aggregated and the system will automatically bonus workers to ensure that they earned minimum wage. It will log items such as worker time self-reports. Data will not be shared with anyone outside our research team. You can choose not to participate by not filling out the form. The system will still log that you completed the task so that you can receive a bonus. However, if no workers fill out the form, no bonuses will be sent because the system cannot estimate work time for the task.
Time Involvement: You can participate for as long as you want by filling out the form above on any HITs you encounter with it installed.
Risks and Benefits: Fair Work will supply bonuses based on aggregate worker reports. Beyond this, we cannot and do not guarantee or promise that you will receive any benefits from using Fair Work.
Subject's Rights: If you have read this form and have decided to participate in this project, please understand your participation is voluntary and you have the right to withdraw your consent or discontinue participation at any time without penalty or loss of benefits to which you are otherwise entitled. The alternative is not to participate, either by not filling out the form or by not completing this HIT. Your individual privacy will be maintained in all published and written data resulting from the study.
Contact Information: If you have any questions, concerns or complaints about this research, its procedures, risks and benefits, contact the Fair Work team at fairwork@cs.stanford.edu.
Independent Contact: If you are not satisfied with how this study is being conducted, or if you have any concerns, complaints, or general questions about the research or your rights as a participant, please contact the Stanford Institutional Review Board (IRB) to speak to someone independent of the research team at (650)-723-2480 or toll free at 1-866-680-2906. You can also write to the Stanford IRB, Stanford University, 3000 El Camino Real, Five Palo Alto Square, 4th Floor, Palo Alto, CA 94306.
You may print or save a copy of this page for your records.
TurkerView is designed to bridge the gap between workers & requesters through data & communication.
This is fairly straightforward: we take the completion time and the reward amount (where available) and calculate the average hourly rate for the task. We then color-code that rate against a simple range based on US minimum wage standards, making the numbers easy to digest at a glance.
Color | Pay Range (Hourly) | Explanation
---|---|---
RED | < $7.25 / hr | Hourly averages below the US federal minimum wage
ORANGE | $7.25 - $10.00 / hr | Hourly averages between the federal and highest statewide (CA) minimum wages
GREEN | > $10.00 / hr | Hourly averages above all US minimum wage standards
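The color bands above amount to a simple bucketing of the effective hourly rate. A minimal sketch (the function name `pay_color` is made up for illustration):

```python
def pay_color(reward_dollars, minutes):
    """Classify a HIT's effective hourly rate into TurkerView's color bands."""
    hourly = reward_dollars / (minutes / 60.0)
    if hourly < 7.25:       # below US federal minimum wage
        return "RED"
    if hourly <= 10.00:     # between federal and highest statewide (CA) minimums
        return "ORANGE"
    return "GREEN"          # above all US minimum wage standards
```

For example, a $1.50 reward for 10 minutes of work is $9.00/hr, which falls in the ORANGE band.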
Not all HITs are created equal. Sometimes an hourly wage doesn't convey the full story of a HIT's true worth, so we encourage workers to give their opinion on the overall pay of the task. Was it $8/hr to rate pictures of puppies? A worker could justifiably bump up the rating a bit for something so adorable. 10 hours locked in Inquisit? Even for $10/hr many workers would appreciate the heads up on such a task. The Pay Sentiment rating helps connect workers beyond the hard data.
Icon | Rating | Suggested Guidelines
---|---|---
| Underpaid 1 / 5 | |
| Low 2 / 5 | |
| Fair 3 / 5 | |
| Good 4 / 5 | |
| Generous 5 / 5 | |
Communication is an underrated aspect of mTurk. Clear, concise directions, a fast response to a clarification question, or a resolution to a workflow suggestion can all be valuable aspects of interaction between Requesters & Workers, and it's worth keeping track of them. Plus, everyone enjoys the peace of mind of knowing that if something does go wrong, an actual human will get back to you to solve the issue.
Icon | Rating | Suggested Guidelines
---|---|---
| Unacceptable 1 / 5 | |
| Poor 2 / 5 | |
| Acceptable 3 / 5 | |
| Good 4 / 5 | |
| Excellent 5 / 5 | |
This rating is strictly for approval times. Let's face it, no one wants to mix approval time ratings with how fast a Requester rejects a HIT, so we've saved rejection flags for another category. This provides a more straightforward way to know roughly how long your HIT might sit pending before paying out. The default auto-approval window for most MTurk tasks is 3 days; the maximum is 30 days. We've based our ratings around those data points.
Icon | Rating | Approval Time
---|---|---
| Very Slow 1 / 5 | Over 2 weeks
| Slow 2 / 5 | ~1 - 2 weeks
| Average 3 / 5 | ~3 - 7 days
| Fast 4 / 5 | ~1 - 3 days
| Very Fast 5 / 5 | ~24 hours or less
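The approval-time ratings above map an observed delay onto a 1-5 scale. A minimal sketch of that bucketing, assuming boundaries fall at the low end of each stated range (the function name `approval_rating` is made up for illustration):

```python
def approval_rating(days):
    """Map an observed approval delay in days to TurkerView's 1-5 rating."""
    if days <= 1:
        return "Very Fast 5/5"   # ~24 hours or less
    if days <= 3:
        return "Fast 4/5"        # ~1 - 3 days
    if days <= 7:
        return "Average 3/5"     # ~3 - 7 days
    if days <= 14:
        return "Slow 2/5"        # ~1 - 2 weeks
    return "Very Slow 1/5"       # over 2 weeks
```

A requester who approves within the 3-day default auto-approval window would score at least Fast 4/5 under this scheme.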
TurkerViewJS is the engine behind TurkerView. An efficient collection process combined with a user-friendly interface encourages more frequent worker input & allows for the refinement of aggregate data in real time.
Our API also gives users access to real-time data about HITs and requesters. Users can work with confidence knowing that our platform has vetted thousands of requesters who treat workers fairly.
- Unique Requesters have been reviewed by users on TurkerView
- Individual Reviews are available to TurkerView users
- Awesome Users (and counting) are part of the TurkerView community