Using data shared by some of the most experienced workers on MTurk, users can gain insight into HITs that pay well and are safe to work on. Our users maintain some of the highest standards on the platform, with most holding over a 99.5% approval rate, meaning HITs that appear here will pay out when completed properly.
This refers to the original HIT titled "Compare the speech sound quality of audio samples". Headphones are needed to work on this HIT, and you are taken through a system check first. The first task involves listening to audio and typing out what was said; the next page displays what you typed, graded as per the instructions. You work through 3 blocks of 30 trials. In the second half you work through pairs of audio, clicking on the one you think sounds best; there are 5 blocks of 30 trials in this part. This could be done faster, although I do want to state that (although I am recorded as being a slow worker) I really did work at a faster rate than I usually do. Fair pay.
I started work on this with an hour left on the MTurk timer and ran out of time. I wrote to the requester, who answered me straight away and stated I could complete the HIT and they would set up a compensation HIT for me. Fantastic requester, and I am so grateful they sorted the HIT out for me!!
TurkerView is designed to bridge the gap between workers & requesters through data & communication.
This is fairly straightforward: we take the completion time & the reward amount (where available) and calculate the average hourly rate for the task. We then apply that number to a simple range based on US minimum wage standards and color-code the data so it's easy to digest.
Color | Pay Range (Hourly) | Explanation |
---|---|---|
RED | < $7.25 / hr | Hourly averages below the US Federal minimum wage |
ORANGE | $7.25 - $10.00 / hr | Hourly averages between the Federal and highest statewide (CA) minimum wages |
GREEN | > $10.00 / hr | Hourly averages above all US minimum wage standards |
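As a rough illustration of how that calculation and color banding could work, here is a minimal TypeScript sketch. The thresholds mirror the table above; the function names, constants, and structure are assumptions for illustration, not TurkerView's actual implementation.

```typescript
// Minimal sketch of the hourly-rate color banding described above.
// Thresholds come from the table; names and structure are illustrative only.

type PayBand = "RED" | "ORANGE" | "GREEN";

const FEDERAL_MIN_WAGE = 7.25;        // US Federal minimum wage ($/hr)
const HIGHEST_STATE_MIN_WAGE = 10.0;  // highest statewide (CA) minimum wage ($/hr)

/** Average hourly rate = reward / completion time (converted to hours). */
function hourlyRate(rewardUsd: number, completionSeconds: number): number {
  return rewardUsd / (completionSeconds / 3600);
}

/** Map an hourly rate onto the RED / ORANGE / GREEN bands from the table. */
function payBand(ratePerHour: number): PayBand {
  if (ratePerHour < FEDERAL_MIN_WAGE) return "RED";
  if (ratePerHour <= HIGHEST_STATE_MIN_WAGE) return "ORANGE";
  return "GREEN";
}

// Example: a $1.50 HIT completed in 8 minutes averages $11.25/hr -> "GREEN".
console.log(payBand(hourlyRate(1.5, 8 * 60)));
```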
Not all HITs are created equal. Sometimes an hourly wage doesn't convey the full story of a HIT's true worth, so we encourage workers to give their opinion on the overall pay of the task. Was it $8/hr to rate pictures of puppies? A worker could justifiably bump up the rating a bit for something so adorable. 10 hours locked in Inquisit? Even for $10/hr many workers would appreciate the heads up on such a task. The Pay Sentiment rating helps connect workers beyond the hard data.
Rating |
---|
Underpaid 1 / 5 |
Low 2 / 5 |
Fair 3 / 5 |
Good 4 / 5 |
Generous 5 / 5 |
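For illustration only, the five-point sentiment scale above could be represented as a small type alongside the hard hourly data it complements. The names and fields below are assumptions, not TurkerView's schema.

```typescript
// Illustrative only: the 1-5 Pay Sentiment scale as a type, paired with the
// hourly figure it is meant to complement. Names are assumed, not TurkerView's schema.

type PaySentiment = 1 | 2 | 3 | 4 | 5; // 1 = Underpaid ... 5 = Generous

const PAY_SENTIMENT_LABELS: Record<PaySentiment, string> = {
  1: "Underpaid",
  2: "Low",
  3: "Fair",
  4: "Good",
  5: "Generous",
};

interface PayRating {
  hourlyRate: number;      // hard data: reward / completion time
  sentiment: PaySentiment; // soft data: the worker's overall impression
}
```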
Communication is an underrated aspect of mTurk. Clear, concise directions, a fast response to a clarification question, or a resolution to a workflow suggestion can all be valuable aspects of interaction between Requesters & Workers, and it's worth keeping track of them. Plus, everyone enjoys the peace of mind of knowing that if something does go wrong, there will be an actual human getting back to you to solve the issue.
Rating |
---|
Unacceptable 1 / 5 |
Poor 2 / 5 |
Acceptable 3 / 5 |
Good 4 / 5 |
Excellent 5 / 5 |
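The communication scale follows the same 1-5 pattern as pay sentiment, so a full review record might simply carry one field per rating dimension. The shape below is a hypothetical sketch for illustration, not TurkerView's actual data model.

```typescript
// Hypothetical review shape combining the rating dimensions described in this
// section; field names and structure are assumptions for illustration only.

type FiveStarRating = 1 | 2 | 3 | 4 | 5;

interface HitReview {
  requesterId: string;
  hitTitle: string;
  hourlyRate: number;             // derived from reward and completion time
  paySentiment: FiveStarRating;   // 1 = Underpaid ... 5 = Generous
  communication?: FiveStarRating; // 1 = Unacceptable ... 5 = Excellent; optional if there was no contact
  comment?: string;               // free-text review body
}
```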
This rating is strictly for approval times. Let's face it, no one wants to mix approval time ratings with how fast a Requester rejects a HIT, so we've saved rejection flags for another category. This provides a more straightforward way to gauge how long your HIT might sit pending before paying out. The default auto-approval for most MTurk tasks is 3 days; the maximum is 30 days. We've tried to base our ratings around those data points.
Rating | Approval Time |
---|---|
Very Slow 1 / 5 | Over 2 weeks |
Slow 2 / 5 | ~1 - 2 weeks |
Average 3 / 5 | ~3 - 7 days |
Fast 4 / 5 | ~1 - 3 days |
Very Fast 5 / 5 | ~24 hours or less |
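A sketch of how the approval-time buckets above could be expressed in code; the thresholds come straight from the table, while the function and label names are assumptions for illustration.

```typescript
// Illustrative mapping from observed approval time (in hours) to the 1-5
// buckets in the table above. Thresholds follow the table; names are assumed.

function approvalRating(hoursToApproval: number): { score: number; label: string } {
  if (hoursToApproval <= 24) return { score: 5, label: "Very Fast" };   // ~24 hours or less
  if (hoursToApproval <= 3 * 24) return { score: 4, label: "Fast" };    // ~1 - 3 days
  if (hoursToApproval <= 7 * 24) return { score: 3, label: "Average" }; // ~3 - 7 days
  if (hoursToApproval <= 14 * 24) return { score: 2, label: "Slow" };   // ~1 - 2 weeks
  return { score: 1, label: "Very Slow" };                              // over 2 weeks
}

// Example: a HIT approved after 5 days (120 hours) falls in the "Average" bucket.
console.log(approvalRating(120)); // { score: 3, label: "Average" }
```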
TurkerViewJS is the engine behind TurkerView. An efficient collection process combined with a user-friendly interface encourages more frequent worker input & allows for the refinement of aggregate data in real time.
Our API also gives users access to real-time data about HITs and requesters. Users can feel confident knowing that our platform has vetted thousands of requesters who treat workers fairly.
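As a purely hypothetical sketch of what querying such data from a script might look like, assuming a JSON endpoint and an API key: the URL, path, auth header, and response fields below are invented for illustration and are not documented TurkerView endpoints.

```typescript
// Hypothetical only: the endpoint URL, auth header, and response fields are
// assumptions for illustration, not TurkerView's documented API.

interface RequesterStats {
  requesterId: string;
  hourlyAverage: number; // aggregate hourly rate across reviews
  reviewCount: number;
}

async function fetchRequesterStats(requesterId: string, apiKey: string): Promise<RequesterStats> {
  const response = await fetch(
    `https://example-turkerview-api.invalid/requesters/${requesterId}`, // placeholder URL
    { headers: { "X-API-Key": apiKey } }                                // placeholder auth header
  );
  if (!response.ok) throw new Error(`Request failed: ${response.status}`);
  return (await response.json()) as RequesterStats;
}
```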
- Unique Requesters have been reviewed by users on TurkerView
- Individual Reviews are available to TurkerView users
- Awesome Users, and counting, are part of the TurkerView community