Panda Crazy Script for Pandas

ScrapingForQuarters

New Member
Joined
Mar 10, 2016
Messages
32
Reaction score
0
Points
256
Age
70
Location
Chicago Area
Gender
Male
Could you add detail on the mmmturkeybacon Q Order Fix ...

Is that 2 DB "hits" in addition to the one for the submit? (Is there one for the submit too that counts against the "1 second rule"?)

Thanks

Is it a read of the Q and then a "rewrite" of the Q?

If not, what are they, and why are there 2?
 

Johnnyrs

Active Member
Contributor
Crowd Pleaser
Joined
Jan 12, 2016
Messages
546
Reaction score
1,603
Points
668
Age
53
Location
Whittier, California
Gender
Male
I updated all 3 scripts.

Didn't notice before the accept, but the tab title after I hit the Accept Key is very long and practically useless.
It shows (1/1) <a href-"http .....
You probably have a script that changes the requester name to a link. I'm not running a script that does that, so I didn't realize it would happen. It's only supposed to grab the requester name. I'll fix it in the next update.

Could you add detail on the mmmturkeybacon Q Order Fix ...
Is that 2 DB "hits" in addition to the one for the submit? (Is there one for the submit too that counts against the "1 second rule"?)
Thanks
Is it a read of the Q and then a "rewrite" of the Q?
If not, what are they, and why are there 2?
Not exactly sure what you are asking here. I would suggest disabling the mmmturkeybacon Q Order Fix if you're running the PC queue helper, since they can interfere with each other. The queue helper gets the queue info from PC instead of requesting it from mturk the way the older script does. The older script sends 2 requests to mturk after a submit, which causes PREs if you submit too fast. PC updates the queue every 40 seconds, adds hits to the queue watch when a hit is accepted, and removes them when a hit is submitted or returned, so the queue info stays as close to accurate as possible.
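Roughly, the bookkeeping works something like this (a simplified sketch; the names here are illustrative, not the script's actual code):

```javascript
// Simplified sketch of PC's local queue bookkeeping; names are illustrative.
let queue = []; // local copy of the mturk queue

function onAccepted(hit) { queue.push(hit); }           // hit caught: add it
function onSubmitted(hitId) { removeFromQueue(hitId); } // hit turned in: drop it
function onReturned(hitId) { removeFromQueue(hitId); }  // hit given back: drop it

function removeFromQueue(hitId) {
  queue = queue.filter(h => h.hitId !== hitId);
}

// Only one real request to mturk every 40 seconds keeps the local copy
// honest without risking PREs.
setInterval(() => {
  // fetchRealQueue() is a stand-in for the script's actual mturk request:
  // fetchRealQueue().then(realQueue => { queue = realQueue; });
}, 40000);
```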
 

ScrapingForQuarters

New Member
Joined
Mar 10, 2016
Messages
32
Reaction score
0
Points
256
Age
70
Location
Chicago Area
Gender
Male

Not exactly sure what you are asking here.
I know all that ... I read everything you post and even look at the History on the script pages to see how things change. I'm a retired computer programmer, so the details of these things interest me more than they do many of your users.

I was merely asking for some details (guts) of the Q Order Fix script, how you knew it made these extra "interfering" DB accesses, and what their purpose was. I "evangelize" your 3 scripts quite often elsewhere and recently described your advantages in a post, but I wanted more details.

If it matters, here's a link to my TM scripts in Chrome:
alphabetized: https://drive.google.com/open?id=0B8L21NA8BTg1aVZhQ1E0V3BOYTA

most recently updated: https://drive.google.com/open?id=0B8L21NA8BTg1YUxqTDdEOUY3aTA
 

ikarma

New Member
Contributor
Joined
Jan 12, 2016
Messages
48
Reaction score
105
Points
358
Age
51
Gender
Male
Multiple tabs are working pretty well. I do often get an error that I've already worked on the hit. I can eliminate the problem by setting a tab to load the 2nd hit in the queue, but that leaves one hit in the queue unworked for a while.
 

Johnnyrs

Active Member
Contributor
Crowd Pleaser
Joined
Jan 12, 2016
Messages
546
Reaction score
1,603
Points
668
Age
53
Location
Whittier, California
Gender
Male
I know all that ... I read everything you post and even look at the History on the script pages to see how things change. I'm a retired computer programmer, so the details of these things interest me more than they do many of your users.

I was merely asking for some details (guts) of the Q Order Fix script, how you knew it made these extra "interfering" DB accesses, and what their purpose was. I "evangelize" your 3 scripts quite often elsewhere and recently described your advantages in a post, but I wanted more details.
Ok, I just assume people don't know programming so I can give an understandable answer.

In the Q Order Fix script, right after the page loads it sends an ajax request to mturk for the queue. When that script was created years ago there was no problem because mturk's request limit was lower, so PREs weren't sent as often. The problem now is that mturk gets the submit request and then the script instantly sends the queue request. Usually those two requests are far enough apart that normal-speed hits won't run into a PRE. But on a batch with hits that can be done in a few seconds, all of those requests to mturk start to add up, and mturk sends PREs until the requests slow down.

The Q Order Fix could be patched with a setTimeout of a second or two so the queue request isn't sent instantly. But if you are submitting hits fast, the script then won't have time to grab the current queue data and change the next-url variable, so the next hit will be whatever mturk thinks it should be, which could be wrong. Rarely, it has to send a third request: some hits don't have an iframe and so don't have the next-url variable, and with those rare hits the script has to redirect to the next hit itself, so it could be sending 3 requests too close together. But I haven't found a hit like that in a long time.
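In code terms, the timing trade-off looks roughly like this (a hedged sketch; `QUEUE_URL`, `fetchQueue`, and `updateNextUrl` are stand-ins, not the actual Q Order Fix code):

```javascript
// Sketch of the timing trade-off; the names here are stand-ins.
const QUEUE_URL = 'https://www.mturk.com/...'; // mturk queue page (placeholder)

function fetchQueue() {
  return fetch(QUEUE_URL).then(r => r.text());
}

function updateNextUrl(html) {
  // parse the queue out of the page and rewrite the next-hit link
}

// What the old script effectively does: the queue request lands right on
// top of the submit request mturk just handled, so fast batches hit PREs.
window.addEventListener('load', () => fetchQueue().then(updateNextUrl));

// The setTimeout variant avoids the PRE, but on fast submits the next page
// loads before the queue data arrives, so the next-hit url stays stale:
// window.addEventListener('load', () =>
//   setTimeout(() => fetchQueue().then(updateNextUrl), 1500));
```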

So my Queue helper script doesn't send any requests to mturk. After a page is submitted, it sends a message to the main PC script, and the main script sends back the queue data it already has, which is current and accurate enough. These messages take milliseconds, so it should be faster than a human can submit. The Queue helper also sends a message when a hit is submitted or returned, so the main script can remove it from its queue data without sending a request to mturk. Every 40 seconds the main script gets the real queue data from mturk, which is a long enough gap that it shouldn't cause many PREs.
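As a rough sketch, the message flow is something like this (assuming localStorage events as the cross-tab transport; the actual scripts' messages and names differ):

```javascript
// Queue helper tab: report a submit to the main script instead of asking mturk.
function reportSubmitted(hitId) {
  localStorage.setItem('pcMessage', JSON.stringify({
    type: 'submitted',
    hitId: hitId,
    time: Date.now() // changing value guarantees the storage event fires
  }));
}

// Main PC script (in its own tab): update the local queue copy and publish it.
let queue = []; // the main script's copy of the mturk queue
window.addEventListener('storage', function (e) {
  if (e.key !== 'pcMessage' || !e.newValue) return;
  const msg = JSON.parse(e.newValue);
  if (msg.type === 'submitted') {
    queue = queue.filter(h => h.hitId !== msg.hitId); // no mturk request needed
    localStorage.setItem('pcQueueData', JSON.stringify(queue));
  }
});
```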

Hope that explains it.
 
  • Like
Reactions: ScrapingForQuarters

Johnnyrs

Active Member
Contributor
Crowd Pleaser
Joined
Jan 12, 2016
Messages
546
Reaction score
1,603
Points
668
Age
53
Location
Whittier, California
Gender
Male
Multiple tabs are working pretty well. I do often get an error that I've already worked on the hit. I can eliminate the problem by setting a tab to load the 2nd hit in the queue, but that leaves one hit in the queue unworked for a while.
Yes, the queue data isn't being sent fast enough when people submit that fast. I have 3 ideas to try, and I'll have an update today with the first one. It seems a lot faster because it asks for the queue data right at page load instead of waiting for all the images on the page to load.

One way of fixing this right now is to set the first tab to the first position in the options window (the Q button on the right side), the 2nd tab to the 3rd position, and the 3rd tab to the 5th position. Just skipping 1 position should let it catch up.
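In script terms, the timing change amounts to roughly this (a sketch; `requestQueueData` and the message format are stand-ins):

```javascript
// Old behavior: wait for the full 'load' event, which stalls until every
// image on the page has finished downloading.
// window.addEventListener('load', requestQueueData);

// New behavior: ask for the queue data as soon as the DOM is ready.
document.addEventListener('DOMContentLoaded', requestQueueData);

function requestQueueData() {
  // stand-in: message the main PC script for its current queue data
  localStorage.setItem('pcMessage',
    JSON.stringify({ type: 'getQueue', time: Date.now() }));
}

// Greasemonkey/tampermonkey can also run the script earlier by putting an
// "@run-at document-end" line in the metadata block.
```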
 

ikarma

New Member
Contributor
Joined
Jan 12, 2016
Messages
48
Reaction score
105
Points
358
Age
51
Gender
Male
Yes, the queue data isn't being sent fast enough when people submit that fast. I have 3 ideas to try, and I'll have an update today with the first one. It seems a lot faster because it asks for the queue data right at page load instead of waiting for all the images on the page to load.

One way of fixing this right now is to set the first tab to the first position in the options window (the Q button on the right side), the 2nd tab to the 3rd position, and the 3rd tab to the 5th position. Just skipping 1 position should let it catch up.

Yeah, that works pretty well when I have a lot of hits coming in. When it's a little slower I have been setting it to the first hit in one tab and the last hit in another tab, and that's when I get the error. I might try a minimum of 3 tabs to let it catch up. I can't wait until some huge batches show up so I can work in a bunch of tabs again.

Great work!!!
 

webgyrl

Learner Turk
Joined
Sep 18, 2016
Messages
29
Reaction score
48
Points
213
Age
38
Location
Maryland
Gender
Female
The script is controlled from the window that is running on https://www.mturk.com/mturk/welcome?pandacrazy=on
The collect buttons are what turn each one on and off individually. You don't want too many pandas collecting at once because the script has to check each one slowly. I recommend not going over 10 jobs collecting. If you need to limit the number of hits collected, you can click on the Details button for that job and set a number for the Queue Hit Limit option. The Panda Crazy helper is just an add-on that shows buttons for easily adding pandas, instead of clicking on the Add Job button and pasting a panda url there. So keep your eye on the main script. It shows your queue at the bottom too, so it's convenient to watch.
Thanks so much! I am kinda getting the hang of it!

Do you know what's going to happen with the scripts when Amazon rolls out the new Worker website/dashboard?
 

Cass

Happy Turker
Joined
Sep 16, 2016
Messages
27
Reaction score
28
Points
13
Location
Michigan
Gender
Female
Is there a way to delete all your pandas and start over?
 

Johnnyrs

Active Member
Contributor
Crowd Pleaser
Joined
Jan 12, 2016
Messages
546
Reaction score
1,603
Points
668
Age
53
Location
Whittier, California
Gender
Male
Is there a way to delete all your pandas and start over?
Yes, there are two ways. If you just want to delete all the pandas and not any of your options, you can click on the Search Jobs button at the top. There is a checkbox at the very top which will select all jobs. Click on the Delete Selected button at the top and it should prompt you before deleting.

The other way is to delete all the data by using the reset url I mentioned in previous posts in here. It deletes everything and starts over with the default data, alarms and options, so it's really only good for bad situations.
 

Johnnyrs

Active Member
Contributor
Crowd Pleaser
Joined
Jan 12, 2016
Messages
546
Reaction score
1,603
Points
668
Age
53
Location
Whittier, California
Gender
Male
Do you know what's going to happen with the scripts when Amazon rolls out the new Worker website/dashboard?
I assume many scripts will stop working, but if the original scripter is active they might update their script. For other scripts, it will take time for other scripters to either change the original script or make their own. I plan to convert my scripts to the new site once they move some more pages over.
 
  • Like
Reactions: webgyrl

Johnnyrs

Active Member
Contributor
Crowd Pleaser
Joined
Jan 12, 2016
Messages
546
Reaction score
1,603
Points
668
Age
53
Location
Whittier, California
Gender
Male
Today I released a new update (0.3.16) to greasyfork. Be sure to update it if greasemonkey/tampermonkey doesn't update it automatically; it should update sometime today. Be sure to back up your data by exporting it from the jobs menu at the top so you won't lose anything if something goes wrong. This fixes a memory problem, so updating to this version could be a good idea if you're having any slowdown problems.

New Features:
  • Two more stats were added for PREs. It now counts the PREs it gets in ham mode and out of ham mode. These stats can be seen by clicking on the PRE'S text at the top. PREs in ham mode are expected, so they're not as important as the PREs it gets in normal mode.
Bugs Fixed:
  • Queue hit limits for jobs are more responsive and more accurate.
  • A big memory problem was found and is now fixed. If you found the script slowing down or not getting a lot of hits after leaving it running for hours or days, this should fix it. I still recommend restarting the script every day anyway. The best way to restart in Chrome specifically is to close the tab or window PC is running in, then make a new window and start PC so it clears out all the memory that tab or window was using; Chrome doesn't clear out that memory as much as it should. Technical: the memory problem was coming from tooltips in the jQuery UI library. The script wasn't removing the tooltip div the library created, so they just piled up until the script slowed down too much. The tooltips are supposed to work that way for browser compatibility, but I didn't realize it. I found a workaround: destroy the tooltip after the mouse moves away and then recreate it (a sketch follows below).
Hope this memory problem didn't cause anyone trouble. I noticed it when I was collecting a fast big batch and saw I was doing the hits in my queue faster than the script was filling it, which wasn't supposed to happen with the fast timer I had set.
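For the technically curious, the workaround is roughly this (a sketch assuming the standard jQuery UI tooltip widget; the script's real code is more involved):

```javascript
// Destroy the jQuery UI tooltip when the mouse leaves so its generated
// <div> is removed from the DOM, then recreate it for the next hover.
function leakFreeTooltip(selector) {
  $(selector).tooltip().on('mouseleave', function () {
    $(this).tooltip('destroy'); // removes the widget and its leftover div
    $(this).tooltip();          // re-arm the tooltip for the next mouseover
  });
}

// usage (hypothetical selector): leakFreeTooltip('.pcm-tooltip');
```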
 

webgyrl

Learner Turk
Joined
Sep 18, 2016
Messages
29
Reaction score
48
Points
213
Age
38
Location
Maryland
Gender
Female
I assume many scripts will stop working but if the original scripter is active they might update the script. For other scripts it will take time for other scripters to either change the original script or make their own. I have plans to change my scripts to the new site when they convert some more pages over.
You, sir, are awesome. Thank you :)
 

Yatagarasu

all up in the videos, all on the records, dancin'
Contributor
Crowd Pleaser
HIT Poster
Joined
Jan 13, 2016
Messages
10,012
Reaction score
13,638
Points
2,238
Age
36
Gender
Female
Been acting weird for me.

(1) I'm having the issue again where I need to delete unnamed pandas individually.

(2) A moment ago I restarted and there were duplicates of certain pandas. I deleted them. Upon refreshing again, the dupes and the originals were gone (so I guess it was just some sort of visual error and the dupes weren't actually there?). At least it was only 3, all of which can easily be grabbed from the work threads.
 

Tigerpants

Murderer of Threads
Contributor
Joined
Apr 1, 2016
Messages
9,378
Reaction score
7,979
Points
1,038
Age
31
Location
Center of hell
Gender
Female
Feature Request:

I would love a little 'date/time added' field in the details, so I can easily see how long I've been trying to catch something and decide whether I want to keep trying or not.
 

Johnnyrs

Active Member
Contributor
Crowd Pleaser
Joined
Jan 12, 2016
Messages
546
Reaction score
1,603
Points
668
Age
53
Location
Whittier, California
Gender
Male
Been acting weird for me.

(1) I'm having the issue again where I need to delete unnamed pandas individually.

(2) A moment ago I restarted and there were duplicates of certain pandas. I deleted them. Upon refreshing again, the dupes and the originals were gone (so I guess it was just some sort of visual error and the dupes weren't actually there?). At least it was only 3, all of which can easily be grabbed from the work threads.
Have you tried exporting the jobs and then importing them back in? That might clear out any problem data that could be causing these issues. I'll recheck the script to see if I can find something that might cause it. So when you list all the jobs, the unnamed pandas do not show up? Did you add them with a group ID number and use them as a panda or in search mode? Did you delete any of the data in the details area after adding them? Maybe I can pinpoint the problem if I can find a way to duplicate it on my system.
 

Johnnyrs

Active Member
Contributor
Crowd Pleaser
Joined
Jan 12, 2016
Messages
546
Reaction score
1,603
Points
668
Age
53
Location
Whittier, California
Gender
Male
Feature Request:

I would love a little 'date/time added' field in the details, so I can easily see how long I've been trying to catch something and decide whether I want to keep trying or not.
Good idea. Maybe a better place would be in the job listing, so it's easier to see all the jobs being collected and how long each has been running. Then turning off a job would be easier too.
 
  • Like
Reactions: Tigerpants

Yatagarasu

all up in the videos, all on the records, dancin'
Contributor
Crowd Pleaser
HIT Poster
Joined
Jan 13, 2016
Messages
10,012
Reaction score
13,638
Points
2,238
Age
36
Gender
Female

Have you tried exporting the jobs and then importing them back in?
No, but I'll try that to be on the safe side.

So when you list all the jobs, the unnamed pandas do not show up? Correct.

Did you add them with a group ID number and use them as a panda or in search mode? Simply used the helper.

Did you delete any of the data in the details area after adding them? Nope. I usually don't touch anything there unless I'm writing in a friendly requester name.
 

Johnnyrs

Active Member
Contributor
Crowd Pleaser
Joined
Jan 12, 2016
Messages
546
Reaction score
1,603
Points
668
Age
53
Location
Whittier, California
Gender
Male
Today I released a new update (0.3.17) to greasyfork. Be sure to update it if greasemonkey/tampermonkey doesn't update it automatically; it should update sometime today. Be sure to back up your data by exporting it from the jobs menu at the top so you won't lose anything if something goes wrong.

New Features:
  • Added a delete button on each job that looks like an X to make it easier for people to delete jobs.
Bugs Fixed:
  • The script should load faster for people who have over 100 jobs. It will still be slow when listing all the jobs.
  • May have fixed a problem where listing jobs didn't show all of them. If this still happens, I can fix it more easily if someone goes into their javascript console and reports any errors that come up when displaying the jobs. Also, the job right after the last one shown is probably the reason the rest aren't listed, so finding and deleting it may fix the problem.
  • May fix a problem for people using the queue helper script where it doesn't always go to the next hit in the queue.
  • Cleaned up some code to iterate over objects a bit faster (see the sketch below).
I do recommend not having hundreds of jobs because it can cause slowness, but I'm still trying to find ways to speed things up.
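For the curious, the iteration cleanup was along these lines (illustrative only, not the exact change):

```javascript
const jobs = { j1: { name: 'panda 1' }, j2: { name: 'panda 2' } }; // example data
function processJob(job) { /* collect or display the job */ }

// Slower: for..in walks the prototype chain and needs an own-property check.
for (const id in jobs) {
  if (Object.prototype.hasOwnProperty.call(jobs, id)) processJob(jobs[id]);
}

// Faster in most engines: grab the own keys once, then use a plain loop.
const keys = Object.keys(jobs);
for (let i = 0; i < keys.length; i++) {
  processJob(jobs[keys[i]]);
}
```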
 
  • Like
Reactions: Yatagarasu

groundturk

New Member
Joined
Jun 16, 2016
Messages
8
Reaction score
6
Points
203
Age
48
Gender
Male
This has become my new favorite script. I do have a feature request, if you don't mind. I like to check the 'Accepted' tab from time to time to see if anything came in and was missed (I frequently have to step away from my computer). Would it be possible to add a date/time stamp to the list of HITs that were accepted?