Using data shared by some of the most experienced workers on MTurk, users can gain insight into HITs that pay well and are safe to work on. Our users maintain some of the highest standards on the platform (most hold an approval rate above 99.5%), so HITs that appear here should pay out when completed properly.
SLOW DOWN ON THE CODE GENERATION PAGE! READ THIS FIRST:
The code is generated on a BUTTON that cannot be copy/pasted from. I have NO CLUE what my clipboard was prior to clicking it, but I THINK it might auto-generate a code and set your computer's clipboard text to the code value. I have NO CLUE if my guess is correct or not, but my clipboard was a random # I don't remember copying after clicking the button.
The directions on the code page are pretty lackluster as far as how this goes, so I manually typed out the code generated ON THE BUTTON TEXT (which was different from what was on my clipboard). I have no clue which (if either) is actually correct, but if you read this before doing the survey, check your clipboard before clicking the button and see if clicking it changes what was there; if it does, that is probably the correct survey code. No clue, godspeed, good luck.
TurkerView is designed to bridge the gap between workers & requesters through data & communication.
This is fairly straightforward: we take the completion time and the reward amount (where available) and calculate the average hourly rate for the task. We then place that number into a simple range based on US minimum wage standards and color-code it so the data is easy to digest at a glance; a sketch of the calculation follows the table below.
Color | Pay Range (Hourly) | Explanation
---|---|---
RED | < $7.25 / hr | Hourly averages below US Federal minimum wage
ORANGE | $7.25 - $10.00 / hr | Hourly averages between Federal & highest statewide (CA) minimum wages
GREEN | > $10.00 / hr | Hourly averages above all US minimum wage standards
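To make those thresholds concrete, here is a minimal TypeScript sketch of the calculation, assuming a reward in dollars and a completion time in seconds; the function and type names are illustrative, and only the wage cutoffs come from the table above.

```typescript
// Sketch of the hourly-rate color coding described above.
// Only the $7.25 and $10.00 thresholds come from the table;
// everything else (names, units) is an assumption for illustration.

type PayColor = "RED" | "ORANGE" | "GREEN";

// Average hourly rate: reward in USD divided by completion time in hours.
function hourlyRate(rewardUsd: number, completionSeconds: number): number {
  return rewardUsd / (completionSeconds / 3600);
}

function payColor(rate: number): PayColor {
  if (rate < 7.25) return "RED";     // below US Federal minimum wage
  if (rate <= 10.0) return "ORANGE"; // between Federal and highest statewide (CA) minimum wage
  return "GREEN";                    // above all US minimum wage standards
}

// Example: a $1.00 HIT that takes 5 minutes averages $12.00/hr.
const rate = hourlyRate(1.0, 300);
console.log(rate.toFixed(2), payColor(rate)); // "12.00 GREEN"
```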
Not all HITs are created equal. Sometimes an hourly wage doesn't convey the full story of a HIT's true worth, so we encourage workers to give their opinion on the overall pay of the task. Was it $8/hr to rate pictures of puppies? A worker could justifiably bump up the rating a bit for something so adorable. Ten hours locked in Inquisit? Even at $10/hr, many workers would appreciate a heads-up about such a task. The Pay Sentiment rating helps connect workers beyond the hard data.
Rating | Scale
---|---
Underpaid | 1 / 5
Low | 2 / 5
Fair | 3 / 5
Good | 4 / 5
Generous | 5 / 5
Communication is an underrated aspect of mTurk. Clear, concise directions, a fast response to a clarification question, or a resolution to a workflow suggestion can all be valuable parts of the interaction between Requesters & Workers, and they're worth keeping track of. Plus, everyone enjoys the peace of mind of knowing that if something does go wrong, there will be an actual human getting back to you to solve the issue.
Rating | Scale
---|---
Unacceptable | 1 / 5
Poor | 2 / 5
Acceptable | 3 / 5
Good | 4 / 5
Excellent | 5 / 5
This rating is strictly for approval times. Let's face it, no one wants to mix approval time ratings with how fast a Requester rejects a HIT, so we've saved rejection flags for another category. This provides a more straightforward way to gauge how long your HIT might sit pending before paying out. The default auto-approval for most MTurk tasks is 3 days, and the maximum is 30 days; we've tried to base our ratings around those data points.
Rating | Scale | Approval Time
---|---|---
Very Slow | 1 / 5 | Over 2 weeks
Slow | 2 / 5 | ~1 - 2 weeks
Average | 3 / 5 | ~3 - 7 days
Fast | 4 / 5 | ~1 - 3 days
Very Fast | 5 / 5 | ~24 hours or less
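For reference, here is a minimal sketch of how those bands could be mapped onto the 1-5 scale; the boundaries come from the table, but the exact behavior at the band edges and the function name are assumptions.

```typescript
// Maps an approval time (in hours) onto the 1-5 bands from the table above.
// Edge-of-band behavior is an assumption; TurkerView's internal logic may differ.
function approvalRating(hours: number): 1 | 2 | 3 | 4 | 5 {
  const days = hours / 24;
  if (hours <= 24) return 5; // Very Fast: ~24 hours or less
  if (days <= 3) return 4;   // Fast: ~1 - 3 days
  if (days <= 7) return 3;   // Average: ~3 - 7 days
  if (days <= 14) return 2;  // Slow: ~1 - 2 weeks
  return 1;                  // Very Slow: over 2 weeks
}

// Example: a HIT that sits pending for 5 days lands in the "Average" band.
console.log(approvalRating(5 * 24)); // 3
```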
TurkerViewJS is the engine behind TurkerView. An efficient collection process combined with a user-friendly interface encourages more frequent worker input & allows for the refinement of aggregate data in real time.
Our API also gives users access to real-time data about HITs and requesters. Users can feel confident knowing that our platform has vetted thousands of requesters who treat workers fairly.
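As a rough illustration of how a client might consume that kind of real-time data, here is a hedged sketch; the endpoint URL, header name, and response fields below are placeholders for illustration, not the documented TurkerView API.

```typescript
// Hypothetical client sketch: the URL, header, and response shape are
// placeholders, not the real TurkerView API.
interface RequesterSummary {
  requesterId: string;
  hourlyAverage: number; // aggregate hourly rate in USD
  reviewCount: number;
}

async function fetchRequesterSummary(requesterId: string, apiKey: string): Promise<RequesterSummary> {
  const res = await fetch(
    `https://example.com/api/requesters/${encodeURIComponent(requesterId)}`, // placeholder URL
    { headers: { "X-API-Key": apiKey } } // header name is an assumption
  );
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return (await res.json()) as RequesterSummary;
}
```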
- Unique Requesters have been reviewed by users on TurkerView
- Individual Reviews are available to TurkerView users
- Awesome Users (and counting) are part of the TurkerView community