New Hit: Segment and label the objects in this image (Tool fixed)

James Hennessey

New Member
Requester
Joined
Aug 2, 2018
Messages
5
Reaction score
1
Points
1
Age
31
Hello!

I'm new to MTurk as a requester and looking to get some MTurk workers to pick up my HITs. I'm still trying to improve my image annotation tool and work out how best to balance reward and worker qualifications.

This is my HIT; please search "Segment and label the objects in this image (Tool fixed)"

Title: Segment and label the objects in this image (Tool fixed)
Description: Please look at this image, outline any objects that belong to one of our target object classes, and select the object's class.
Reward: $0.08
Assignment Duration: 1 hour
Auto Approval Delay: 2 days
Qualification 1: user must have > 75% approval rate
Qualification 2: user must have submitted > 500 assignments

I made a mistake when first posting, so please only attempt the job with (Tool Fixed) in the title. I can't work out how to delete the previous one, as I overwrote one of the files with the HIT IDs... noob mistake.

Any feedback much appreciated - thanks!
 

A6_Foul_Out

I only use Organic TurkerView reviews
Contributor
HIT Poster
Joined
Aug 17, 2016
Messages
11,754
Reaction score
21,657
Points
1,538
Age
25
Location
CT's Attic
Gender
Male
You're using the wrong qual on your HITs, and nobody can currently accept them:

"HIT Submission Rate" isn't the qualification you're looking to use.

You probably meant "Total HITs approved is greater than 500".
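For reference, the corrected requirements would look something like this in boto3 terms (a sketch only; the QualificationTypeIds below are MTurk's documented system qualifications for approval rate and total HITs approved, but double-check them against the current API docs before relying on them):

```python
# Corrected qualification requirements for an MTurk CreateHIT call.
# System QualificationTypeIds per the MTurk API docs (verify against
# current documentation -- treated as an assumption here):
PERCENT_APPROVED = "000000000000000000L0"      # Worker_PercentAssignmentsApproved
NUMBER_HITS_APPROVED = "00000000000000000040"  # Worker_NumberHITsApproved

qualification_requirements = [
    {
        # > 75% approval rate
        "QualificationTypeId": PERCENT_APPROVED,
        "Comparator": "GreaterThan",
        "IntegerValues": [75],
    },
    {
        # "Total HITs approved" > 500 -- not the submission-rate qual
        "QualificationTypeId": NUMBER_HITS_APPROVED,
        "Comparator": "GreaterThan",
        "IntegerValues": [500],
    },
]

# These would then be passed when creating the HIT, e.g.:
# client = boto3.client("mturk")
# client.create_hit(..., QualificationRequirements=qualification_requirements)
```

With the submission-rate qual swapped for the approved-count qual, workers who meet both thresholds should be able to accept the HITs.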
 

James Hennessey

New Member
Requester
Joined
Aug 2, 2018
Messages
5
Reaction score
1
Points
1
Age
31
Thank you! I have updated the qualification and you should be able to do it now. Thanks! :)
 

James Hennessey

New Member
Requester
Joined
Aug 2, 2018
Messages
5
Reaction score
1
Points
1
Age
31
A bunch of new image annotation HITs posted. We're testing annotating a whole image for $0.80 versus single polygon annotations for $0.01. Feedback appreciated!
 

A6_Foul_Out

I only use Organic TurkerView reviews
Contributor
HIT Poster
Joined
Aug 17, 2016
Messages
11,754
Reaction score
21,657
Points
1,538
Age
25
Location
CT's Attic
Gender
Male
TL;DR: Mechanical Turk forums are typically for American workers who expect American wages. It's clear that your task either isn't in need of American workers or you're not willing to pay American wages. These forums aren't the best place to reach the type of workers you're looking for.

https://turkerview.com/requesters/?id=A1TXARRSMZ2JC9

$1.80 / hour
00:00:20 / completion time

Cons

It's image annotation at a penny; that's always going to be difficult to make worthwhile, but the poor layout, design, and server resources behind the HIT make it pretty much impossible.

The obvious issue with a penny HIT & images is that if the image loads slowly (which these did) you're already off to a rough start. The Requester has them hosted on his own sluggish private server instead of letting AWS handle it and deliver content to workers quickly.

You'd think only having to annotate one object would make these somewhat okayish, even passable, except that aside from the load times, the images are also shared between all workers. So if you get an image that already has all or most objects annotated, you're left twiddling your thumbs with nothing to do, or wasting time hunting for something to do. Some images are completely covered by annotation boxes, and if you click one it throws an annoying dialog about how you're not allowed to edit someone else's work.

And to top it off, the dropdown you have to use after closing the object annotation to select what kind of object you're labeling is, of course, clunky and slows down submission considerably. It would be nice if the Requester split these up into "annotate doorways", "annotate mirrors", etc., so the dropdown wasn't necessary, but that wasn't the case for the HITs when I gave them a test run.
Pros

You only have to annotate one object.
Perhaps this penny will be distributed heads up, so it's lucky?
 

James Hennessey

New Member
Requester
Joined
Aug 2, 2018
Messages
5
Reaction score
1
Points
1
Age
31
Thanks for the feedback. We're still experimenting with the different options and trying to work out the best way to get our data annotated. I'll post again on our next iteration to see if we do any better.