OneSpace fka CrowdSource WorkStation

clickhappier (Subforum Curator):
OneSpace history:
  • CrowdSource was a major MTurk requester from 2009 through July 2015, when it left the platform because of that year's fee increases. (Turkopticon: reviews of CrowdSource)
  • Originally started as 'Midwest Internet'. Changed its name to CrowdSource circa 2011 (it looks like the name was retroactively changed on all previous Turkopticon reviews at that time, unfortunately for historical purposes).
  • Also owns the domains Write.com and Transcribe.com, which it uses to recruit potential workers and customers.
  • In Nov 2013, acquired another major writing-focused crowdsourcing company, CloudCrowd / Servio.
  • They now have their own 'WorkStation' platform on their own site for workers to use, at first alongside MTurk, and exclusively since July 2015. You get paid daily through a verified PayPal account.
  • CrowdSource changed its name to OneSpace on Nov 17, 2015; URLs changed from crowdsource.com to onespace.com .

The worker platform: http://work.onespace.com/

Their forum, registration required: http://forum.onespace.com/discussions

Their support section, with some general FAQs and a bunch of writing/editing resources: http://support.onespace.com/

Their YouTube channel: https://www.youtube.com/user/CrowdSourcecom/videos


A few userscripts to improve their worker interface: https://greasyfork.org/en/scripts/by-site/onespace.com?set=367


Covered in the posts below:
  • How they changed up their quals in June 2015
  • How to move from MTurk to WS-PayPal with your CS quals/stats intact (hopefully)
  • How the CrowdSource/OneSpace writing/editing system works (feedback, promotion/demotion, pass/fail/reject)
  • If you're curious which client companies some of this work is for
  • '2015 OneSpace Year in Review'
 
clickhappier:
Info from June 2015 about changes to CrowdSource/OneSpace's qual structure:

Biiiig shakeups coming in the world of CS quals. They're going to start having permanently-available qual tests of some sort, which they're calling 'courses', instead of temporarily-available qual HITs; and they're simplifying down to far fewer distinct quals, combining many of them. If you already have at least one qual in a group of old quals that gets combined into a new qual, they say you will be grandfathered into the new qual (exempt from needing to take the new tests) and will have access to more types of work at that level. They haven't yet said anything about how they will implement these new qual tests on the MTurk side. A limited group of CS workers are testing things now.

Being discussed at http://forum.crowdsource.com/discussion/2151/new-workstation-features and http://forum.crowdsource.com/discussion/2157/qualification-courses-feedback on the CS forum.

The planned qual structure revamp was posted as an image:

[Image: table mapping the old quals to the new combined quals]

The five whose testing I'd been posting about over the last several months (Media Moderation, Data Collecting, Data Categorization, Data Tagging [clothing], and Product Matching) are going to be turned into just two: Quality Assurance Specialist and Senior Quality Assurance Specialist.

I don't think that table includes all the quals they still use, so some are probably remaining as-is, title-wise.

They also said they will be revoking more old unused quals from people over the next few weeks, so don't be alarmed.
 

clickhappier:
Everybody who cares has probably done this already months ago, but just in case it still helps someone, here's what I wrote in July 2015 about the process of changing over to their site when they stopped posting on MTurk. Anyone who didn't get as far as activating an account associated with their Amazon login back then will probably have to jump straight to 'beg for help from support@onespace.com' now...

I just did the changeover. Notes about the process:

1. I already had an MTurk-linked account on their WorkStation (WS) website ( http://work.crowdsource.com ). You could only get one by being able to accept a CrowdSource HIT on MTurk and clicking a button that appears in the corner of it, which generates an invite message sent to you through MTurk. Clicking the link in that invite then lets you set up the login through Amazon (if you just go straight to WS and try to log in using Amazon without having an invite to activate, it will say 'you can't do that'). This may no longer be possible if you haven't already done it, as the dregs of CS's last available HITs on MTurk die off leading up to 7/22 paymageddon (there's already nothing still up from CS that I would qualify for if I didn't already have a WS account - check for yourself here), and I don't know if it would be possible for you to keep your quals/status with CS without that (otherwise, you just create a WS account from scratch with PayPal only).

2. I don't know if this is mandatory, but I think it's a good thing to have; I did it a few months ago when it was mentioned on their forum: Set up an alternate login for your WS account, so you can sign in with a username (your email address) and a CS-specific password instead of signing in through Amazon. This alternate sign-in method will then persist across your payment method change. ("Open your Profile in WorkStation. Click Set Password. An email will be sent to the primary email address you provided. Follow the instructions in the email.")

3. Go to http://support.crowdsource.com/ , click the Submit A Request link up top, use your WS-associated email address, and tell them you need to switch from MTurk to Paypal. Some time later (originally I'd heard it was taking like a week for them to respond to these requests, but then someone said it only took like four hours this past weekend, and today I got a response in less than an hour!), you'll get an email reply. Alternatively, you can call CS toll-free at 855-276-9376. Apparently their office hours (for phone calls) are M-F 9am-6pm Central Time.

4. Then sign in on WS and you'll now be prompted to link a Paypal account. Click the link provided, sign into your Paypal account, and grant access to CS.

5. If your Paypal account isn't already in verified status with Paypal, you'll also need to jump through some hoops with them to get that done (similar to Amazon Payments), but mine has been for years. I think there was a CS forum post recently saying that all Paypal accounts must be verified by July 31, though I'm not sure whether newly-switching ones might be required to start off verified.


PS: Though most people aren't qualified for most of their work, CrowdSource's HITs often made up 1/4 of all the HITs on MTurk.
 

clickhappier:
I originally wrote this in July 2015; I think they have at least improved the behavior of the date-filtering a bit since then...

Here's some lesser-known and hidden features of the 'Past Work' section of CS WS, which is still kinda lame/limited, but this might be somewhat helpful in some cases:


When you click on the 'Past Work' header link, it goes to https://work.crowdsource.com/history , which shows totals of everything you've ever done grouped by task title.

If you click on one task type in that list, it shows you a list of each individual task you did of that type (unfortunately in that eterna-scroll 'load more' format I hate, without the option to go directly to specific page numbers), and you can click on one of those tasks to see exactly what you submitted on it.

When on that ^ list page, if you click the x to remove the filter criterion for that task type, or go straight to the hidden URL https://work.crowdsource.com/submissions , then you see a list of each individual task you did of *every* type. Unfortunately, you don't see what type they were that way.


When on the main 'Past Work' page https://work.crowdsource.com/history , there is a feature in the left-hand column to let you apply 'before' and 'after' date range filters. This adds stuff to the URL query string that looks like this, for example: ?after=2015-06-28&before=2015-07-29 . It can be used with just 'before' ( ?before=2015-07-29 ), just 'after' ( ?after=2015-06-28 ), or both together like that.

When on the Submissions page, there isn't an interface given for the date filters, but someone pointed out on the CS forum today that you can still add values like that to the query string manually, for example https://work.crowdsource.com/submissions?after=2015-06-28&before=2015-07-29 , and it will work (in the same buggy way it ever works, described below).
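
If you want to build these filtered URLs programmatically (say, for bookmarks or a userscript), here's a minimal Python sketch. It assumes only what's shown above - the /submissions endpoint and the ?after= / ?before= query parameters - and you'd still need your logged-in browser session for the resulting page to actually load:

```python
from urllib.parse import urlencode

BASE = "https://work.crowdsource.com/submissions"

def history_url(after=None, before=None):
    """Build a Submissions URL with optional date filters.

    Dates are 'YYYY-MM-DD' strings, matching the ?after= / ?before=
    parameters the site itself generates. Caveat: as described below,
    the site currently displays and applies these filters off by a day.
    """
    params = {}
    if after:
        params["after"] = after
    if before:
        params["before"] = before
    return BASE + ("?" + urlencode(params) if params else "")

print(history_url(after="2015-06-28", before="2015-07-29"))
# -> https://work.crowdsource.com/submissions?after=2015-06-28&before=2015-07-29
```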

The dates specified in the query string currently result in filter settings displayed on the page that are 1 day earlier than what you thought you were asking for. So the above example URL turns into "After Jun 27, 2015 X" and "Before Jul 28, 2015 X". And it seems to be even buggier than that - the effect it actually has on the results isn't what it should be based on either the query-string values or the day-earlier displayed filter settings, regardless of whether you interpret the stated dates inclusively or exclusively. So it is currently impossible to narrow down to a single day's totals or tasks. :\
For example, I know I did tasks from group A on 7/22 around 9pm, and tasks from group B on 7/23 around 5am (as recorded by the timestamps CrowdSource displays for them), and none of either before or since then so far. I used this to try testing all combinations of dates around them. If it weren't buggy as hell, at least one of these combinations should've resulted in seeing only A and not B, or only B and not A.
?after=2015-07-21&before=2015-07-24 includes both A and B.
?after=2015-07-22&before=2015-07-24 includes both A and B.
?after=2015-07-23&before=2015-07-24 includes both A and B.
?after=2015-07-24&before=2015-07-24 includes neither.
?after=2015-07-21&before=2015-07-23 includes neither.
?after=2015-07-22&before=2015-07-23 includes neither.
?after=2015-07-23&before=2015-07-23 includes neither.
?after=2015-07-21 includes both A and B.
?after=2015-07-22 includes both A and B.
?after=2015-07-23 includes both A and B.
?after=2015-07-24 includes neither.
?before=2015-07-24 includes both A and B.
?before=2015-07-23 includes neither.

Sooo yeah. :tsk: That feature's pretty worthless right now.​
 

clickhappier:
Some more info I originally put together in July 2015 about how the writing work gets edited; their system has changed a bit since then, with writing that fails editing now getting a chance to be reworked for half credit towards maintaining one's writer level.


For those who do the CS writing work, here's consolidated insight about how their evaluation system works, drawn from experienced workers' posts on the CS forum within the last month:

glitter:

" https://work.crowdsource.com/profile/index This link goes to your main CS profile. It has the statistics for every type of task you do. You can see if you have any rejections because it will show up in your approvals and approval rate. If your approval rate is not 100%, then something was rejected at some point.

https://work.crowdsource.com/career/writing/digest/ This link goes to your tracker for the 100- and 200-word answers. It only shows the status of your last 30 submitted Q&A tasks. If you scroll down, you can see which tasks failed (if any) and get the attached feedback. Also, ignore the 'Pending: 0' in the chart because it says 0 for everyone.

You receive passes or fails based on the number of radio buttons clicked on the Edit checklist.

You receive rejections if your answer falls into any category on the Flag checklist.

Pass: Editor marks two or fewer radio buttons on the Edit checklist. You get paid and your tracker rating (second link) stays the same.

Fail: Editor marks three or more radio buttons on Edit checklist. You get paid, but your tracker rating drops. If your rating drops, you end up getting paid less for the same work. Drop too much and you lose access to the tasks.

Rejection: Editor marked one or more radio buttons on the Flag checklist and a CS admin approved the flag (all flags need admin approval). Your work was rejected, you were not paid, your tracker rating drops, and your approvals (first link) are affected.

When you first start working at CS, there is a short grace period when rejected tasks still pay. However, if you have too many rejections, you won't get paid for them and you may have your work access restricted. "​
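
Condensing glitter's rules into a small Python sketch of the decision logic as described (this is my own model of the forum description, not anything from CS's actual system; the thresholds are the ones quoted above):

```python
def evaluate_submission(edit_dings, flag_dings, flag_upheld_by_admin=False):
    """Model of the editor-checklist outcomes glitter describes.

    edit_dings: radio buttons marked on the Edit checklist.
    flag_dings: radio buttons marked on the Flag checklist.
    flag_upheld_by_admin: all flags require CS admin approval.
    """
    if flag_dings >= 1 and flag_upheld_by_admin:
        # Rejection: not paid, tracker rating drops, approval rate drops.
        return "rejection"
    if edit_dings >= 3:
        # Fail: paid, but tracker rating drops.
        return "fail"
    # Pass: two or fewer edit dings; paid, rating unchanged.
    # (A flag that an admin overturns also ends up here: accepted and paid.)
    return "pass"

assert evaluate_submission(2, 0) == "pass"
assert evaluate_submission(3, 0) == "fail"
assert evaluate_submission(0, 1, flag_upheld_by_admin=True) == "rejection"
```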

Leonard_Telford:

" Fail: Affects rating percentage; gets paid.
Rejection: Affects rating percentage; may not get paid.

You'll notice that sometimes, rejections get paid, particularly if the admin believes the article was done in good faith. However, as you complete more, that benefit of the doubt diminishes. "​

Dessie:

" I might add that the "fail" feature applies only to the 100/200-word writing tasks. Rejections can happen for almost any assignment. "​

glitter:

" To give you some perspective from the editing side...

We see anonymous blocks of text. There's no names, dates, identification, anything attached to the answer. We have no way of knowing if the answer is the writer's first-ever submission or if the person has been submitting tasks for years. However, every writer is expected to follow all of the rules, so every answer is held to the same standards.

Editors have an edit checklist and a flag checklist. Editors either edit or flag, not both. Editors are told by CrowdSource to mark the corresponding box for each error present in the answer. This comes out as automated feedback to the writer, such as "Content contained punctuation errors" or "Content contained style guide errors not already covered by other items on the editing checklist." Editors then have to add feedback to clarify why they checked the box. ...

If an editor marks even one item on the flag checklist, it's grounds for a rejection. This means the editor doesn't have to make any corrections to the answer because it's not usable. A bad source is one of the quickest ways to get a flag on the Q&A tasks, though failing to follow the question type guidelines is also up there.

There is no way to revise your article after submission unless an editor/admin sends the work back to you for revision. Because of this, it's really, really important to read all of the instructions before submitting work. IMO memorizing the task-specific instructions is more important than learning the style guide. Style-guide errors may get you a few dings and fails, but messing up the task-specific instructions often leads to a rejection.

When you first start out, you are given a grace period when you're still paid for rejected tasks, so you have some room to improve without losing access to the writing tasks.

You can see the status of your 30 past writing tasks here: https://work.crowdsource.com/career/writing/digest/

You can also look at all your past work here: https://work.crowdsource.com/history
... "​

glitter:

" The three-ding fail system is definitely broken, but it's on CS to fix that. Since the editing qualification tests are up constantly now, there are always new editors coming into the queues and they may not know that three dings equals a fail.

Besides that, the editing instructions specifically say to mark the radio button for each type of error present in the answer, and CS has said that editors can lose their quals if they fail to mark for the errors.

The system is problematic by design, but the editors aren't the ones who designed it and we're not the ones with the power to fix it. People have had the same complaints since the tier system came out, but CS hasn't given any indications that anything will change. "​

Insolentius:

" If you get paid for a submission that received "We are unable to use this HIT" feedback, that means that it was flagged, but a moderator thought the article was fine. Due to a glitch, the original feedback (the reason why the editor flagged the submission) remains. That's how one of the mods explained the situation to me a while back when I had a similar problem. "​

wtogashi:

" A fail is not the same as a rejection. A fail means you get paid for the work, but it is unusable due to certain errors in your writing. This affects your writing level and your pass percentage, but does not count as a rejection. You do not get paid if your article is rejected. "​

calmehi:

" Your rating is based on your last 30 so if you push all the fails and rejections out you'll be back at level IV. It's harder to move up but slightly easier to stay there. You need 97% to get to level IV but only 92% approval rate to stay there. "​

glitter:

" A CS admin is the only one who can reject work. An editor can't reject something outright. Instead, the editor flags and a CS admin reviews the flag.

If the flag is valid, the editor gets paid, the answer is rejected, and the writer doesn't get paid. If the flag is not valid, the editor does not get paid, the answer is accepted, and the writer gets paid. "​

Dessie:

[how a writer can leave a note for an editor:] " Be sure you have the minimum word count before you leave the note at the bottom. Messages left in the feedback box go to the Crowdsource admins, not the editor, so write your message to the editor directly in the task box. "​

janns:

" Editors have a checklist consisting of 12 radio buttons to check off on every task. These range from subject-verb agreement errors and use of the inverted pyramid structure to punctuation and capitalization errors. The final radio button is for CrowdSource style guide errors, which is a catch-all for everything else.

If the editor checks three of those buttons (or, as people say here on the forum, "dings" you three times), your article receives a fail. You still get paid for it, but it affects your writing level *for the Q&A tasks only* -- If you receive three or more fails within 30 tasks, you drop down a level and your pay rate is decreased. If you receive three or more fails within 30 tasks while you're at Level I, you're off the project.

The editors also have a 7-point checklist for flagging an article. Articles are flagged for things like failing to answer the question, providing inaccurate information, inserting personal opinion, plagiarism, and, especially, using unapproved resources. Sadly, many well-written articles get flagged for violating resource rules. Editors can also flag for "multiple instruction violations and/or poor quality writing" -- this encompasses anything from 400-word articles to articles that contain so much fluff and filler that they can't meet the required word count after editing to articles written by people who haven't mastered the fundamental rules of the English language.

If your article is flagged, it goes to CrowdSource for review. CrowdSource decides whether to uphold the flag or overturn it. If they uphold it, it also shows up as a fail and counts toward dropping you down a level, and you don't get paid for it. If they overturn it, I think the editor may not get paid (not sure, since my flags don't tend to get overturned). In any event, the editor doesn't get paid for her work until the article is reviewed. If you think an article was flagged in error, you can appeal the flag by writing directly to CrowdSource. "​
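
janns's demotion rule can be sketched the same way (again, just a model of the forum description; the Level I bottom-out is from the quote):

```python
def next_level(level, fails_in_last_30):
    """Per janns: 3+ fails within your last 30 tasks drops you a level;
    3+ fails while at Level I takes you off the project entirely."""
    if fails_in_last_30 < 3:
        return level
    return "off the project" if level == 1 else level - 1

print(next_level(4, 2))  # 4 (no change)
print(next_level(4, 3))  # 3
print(next_level(1, 3))  # off the project
```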
 

clickhappier:
CrowdSource has a modest amount of audio/video transcription work, which requires a qual test (and there are complaints that the current version of the transcription test is too writing-oriented and doesn't focus enough on transcription itself).

Someone recently asked on the CS forum why the promoted transcription tasks show up in WS as paying $0.00, like this:

[Image: the Find Work list showing transcription tasks priced at $0.00]

The explanation from CaroleTobias:

"The transcriptions tasks always show zero on the main page because they vary in pay depending on the length of the recordings and what its requirements are. Once you click into them [after you're qualified to be able to access them], you can see how much each one pays and select the one you want to do."​
 

clickhappier:
Info about the 'Task Training' feature added in August 2015:


They posted some more info about the 'task training' thing here: http://forum.crowdsource.com/discussion/2515/new-feature-task-training

It will be a replacement for the 'beginner limits' that cut people off after a small number of HITs of a certain task type until you've had x tasks of that type approved (which should appropriately be called caps/limits, but which some CS forum users have an unfortunate water-muddying habit of calling "soft blocks").

"Task Training allows freelancers to experience new tasks gradually and with increased feedback from trusted reviewers. When you are in Task Training mode, you will receive feedback on each task you submit. Once you hit the passing threshold (project specific), you will graduate and have access to the full project.
You will see a small icon [of a graduation cap] by the task indicating that you are eligible to participate in the Task Training for that project. We are launching this feature gradually, so you will not see it on all tasks."​

It won't actually be visible to any of us yet because
"The feature Task Training has been released to our software, meaning the functionality now exists for our internal project managers to use it on their projects. The process of activating Task Training on projects is a separate initiative from the software release and will happen gradually over the course of the next several weeks, some as early as Monday."​


Some more info from August 2015:

Someone on the CS forum discovered another unannounced aspect of last week's software upgrades - a task type on their Find Work list became greyed-out and had an 'Awaiting Review' label:
[Spoiler image: a task greyed-out with an 'Awaiting Review' label]

The explanation from CS_Julia:

" The "Awaiting Review" experience only applies in two scenarios:
1) When Task Training is enabled and as a trainee, you have reached your limit of tasks that you can do before getting feedback.
2) If a limit is in place for Beginners on a project or projects, and you are a "Beginner" in that project and have reached the limit of tasks you can do without receiving feedback.

We do not currently have any projects that are using Task Training at the moment (we will start activating it on several projects this week). Consequently, what you are seeing is the result of some of our existing projects that are set up with Beginner limits. Our goal over the next several weeks is to migrate projects that are currently using our legacy concept of "Beginner Limits" to the improved experience of Task Training.

You will not see this "Awaiting Review" experience if you have been active in a project for some time and are no longer a Beginner. It only limits freelancers who are new to projects, in an effort to ensure all of the instructions and guidelines are clear before gaining unfettered access to the project. "​
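
Both of CS_Julia's scenarios boil down to the same check. A Python sketch of the described behavior (an outsider's model, not OneSpace's actual logic; the parameter names are mine):

```python
def shows_awaiting_review(new_to_project, tasks_awaiting_feedback, limit):
    """Model of the 'Awaiting Review' state: a Task Training trainee, or a
    legacy 'Beginner', who has hit their cap of tasks submitted without yet
    receiving feedback, sees the project greyed out as 'Awaiting Review'."""
    if not new_to_project:
        return False  # established workers never see this state
    return tasks_awaiting_feedback >= limit

print(shows_awaiting_review(True, 10, 10))   # True: trainee at the cap
print(shows_awaiting_review(False, 50, 10))  # False: established worker
```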
 
clickhappier:
I don't know what this number consists of - all workers ever registered on CrowdSource/OneSpace, workers with a verified Paypal account linked up, workers who did at least x tasks within x time, workers with x qual, etc - but CS_Elise mentioned on 8/13/15 that "the entire workforce" is "about 3500 people" at that time.
 

clickhappier:
Client list I gathered in August 2015:


CrowdSource's client list as displayed at http://www.crowdsource.com/about-us/clients/ , as of 8/17/2015:
  • Staples
    • 'Write: # Word Product Description'
  • Ebay
    • 'Write: #-Word Buying Guide'
  • Target
  • Orbitz
  • Overstock.com
    • 'Write: # Word Product Description'
  • MLB (Major League Baseball)
  • CNet
  • Toys-R-Us
  • Acxiom
  • Coca-Cola
  • Bed Bath & Beyond
  • Citysearch
  • Dictionary.com
  • Beyond.com
  • Grainger
  • RunningShoes.com
  • Shutterfly
  • PowerReviews
    • 'Classify: A Product Question (PR)'
  • Looksmart
  • Kelley Blue Book
  • Groupon
  • ELocal Listing
  • BazaarVoice
    • 'Classify: A Product Question (BV)'
  • Elle Canada
  • Debate.org
  • Koozoo
  • Gigwalk
  • Kelly-Moore Paints

Some of those clients were from previous projects no longer providing work (from my limited knowledge of CS's project history, I have no idea what kind of tasks most of those clients had). And some from current/recent projects aren't listed...
  • I think Ask.com is who the majority of q&a tasks are for.
  • Thredup.com previously was 'Tag: Images of Clothing' and 'Review: Images of Clothing'.
  • Lowes had the color swatch HITs I enjoyed for their brief appearance in 2014, and a couple other types I don't recall.
  • Office Depot was/is matching product numbers from the OfficeMax merger (Staples acquiring Office Depot still isn't a done deal / sure thing yet).
  • Invisible Boyfriend/Girlfriend is who 'Write: Text Messages in 160-characters or Less' is for.
  • Hayneedle.com previously had product descriptions.
  • GreaterGood has 'Viral Video Topic Briefs' (new in Aug 2015, in the 'Research' category but requiring 'Junior Copy Writer' qual from what I can see).
  • OwnerIQ.net's site ManualsOnline.com - a monthly series of jobs that goes in this order:
    1. Review: Is PDF a User Manual
    2. Find: Product Manufacturer and Device Type
    3. Review: Product Manufacturer and Device Type
    4. Select: Product Manufacturer and Device Type
    5. Find: Model Name and Number
    6. Review: Model Name and Number
    7. Select: Model Name and Number


CrowdSource's afaik-modest amount of audio transcription tasks ('Transcribe: Audio or Video Recording', and a hard-to-catch pre-transcription moderation task) comes via their Transcribe.com site; its 'featured clients' list at https://www.transcribe.com/our-team/featured-clients/ as of 8/17/2015:
  • Microsoft
  • Carnegie Council
  • Penn State University
  • Adobe
  • Texas State University
  • Vinted
  • Expert Virtual Agent
  • Lake Champlain Basin Program
  • Evertrue
  • Orbitz
  • BazaarVoice
  • Bed Bath & Beyond
  • PowerReviews
  • Cooking.com
  • University of Texas at Austin
  • Bowling Green State University
  • SnoWest
  • Walden University
  • SuperFeet

And their writing/editing tasks come via their Write.com site, which at http://www.write.com/featured-clients/ as of 8/17/2015 is mostly just repeating the clients seen on the first list from crowdsource.com; these are the different ones I didn't see in the first list:
  • Klip
  • SkyMall
  • BuyBuy Baby
  • FindLaw
 

clickhappier:
Since August 2015, when you initially sign up on the OneSpace site, you won't see any work available to you until you pass at least one of their qual tests (misleadingly labeled 'courses' in the interface). Even the SEO tasks (misleadingly labeled 'surveys') that used to be open to everyone are now mostly restricted to those who've passed the QA Specialist test. (And even once you do pass one or more of the tests, a lot of the work is further restricted to those who managed to do a certain amount of it before it was moved behind a closed, invisible qual.) Unlike on MTurk, with the exception of a few task types they might currently be 'promoting', you can't see the work you don't currently qualify for.

Announcement from CS_Elise (mistakenly calling the tasks "Keyword: Search" instead of "Search: Keywords"):

" Beginning tomorrow, 8/21 (9:00 AM CST) the “Keyword: Search” tasks will be changing their workforce. You will be required to have the Quality Assurance qualification. At this time, it’s open to anyone with a WorkStation account. This change was brought on due to the drop in quality from certain freelancers (not everyone!) entering nonsensical information. We will also be rolling out additional automated quality measures tomorrow to ensure the tasks are completed as instructed. "​
 
clickhappier:
Originally gathered in Sep 2015:

CS's blog has had a few behind-the-scenes-y posts about the company in the last couple months:

CrowdSource Hits Major Milestone: 100 Millionth Task Completed
Jul 22, 2015

" We are celebrating here at CrowdSource and we have our stellar workforce to thank for it. This week, we crossed the threshold of 100 MILLION tasks completed in WorkStation! Our freelancers have been busier than ever, writing, editing, categorizing, moderating and transcribing their hearts out, and this milestone is a testament to all of their hard work.

To celebrate this momentous occasion, we surprised one lucky freelancer (who has chosen to remain anonymous) with a $1,000 bonus! This anonymous individual had no idea she had completed the 100 millionth task when she submitted it. Imagine her delight when she found out what she won! ..."​
(^ Lucky! :faint: )​

What Is it Like to Work at CrowdSource?
Aug 5, 2015

" ...
How was CrowdSource founded?
It’s an interesting story, actually. While operating an online publishing business [Midwest Internet], our founders developed a need for creating engaging, scalable web content. Rather than hiring 150+ team members, our founders decided to look to outsourcing. They used some of the common freelancer platforms on the market, but quickly found managing freelancers at scale was a full-time job in itself. So, they decided to create software to do it for them. As the platform evolved, it became evident this software could be extremely useful to other businesses. That’s when CrowdSource was spun off as its own business entity.

The Culture at CrowdSource
Our leadership team values creativity, collaboration and outside-the-box thinking. Walk through our office and you’ll notice these themes integrated throughout. From the open floor plan to the fully stocked lounge, our “work hard, play hard” attitude inspires open ideation. Everything about our space has been carefully designed with our team members in mind. Much like our approach to software development, usability is our first priority.

Perks of Working at CrowdSource
There are a lot. Our team enjoys an on-site gym and free workout sessions with a personal trainer twice per week, free haircuts, oil changes, and a casual dress code. Lunch is catered every Friday, and on top of that, we take “TGIF” to a new level by closing our office at 2:30 every other Friday. Why? Because we empower our team to focus on accomplishing goals, not working crazy hours.
... "​

What Does a Content Project Coordinator Do?
Aug 18, 2015

" CrowdSource is well-known for its awesome perks and unique work environment (massages and free beer, anyone?). But have you ever found yourself wondering what the people who work there actually do for a living? ...
In addition to the thousands of freelancers who complete work on our platform remotely, CrowdSource is comprised of roughly 50 in-house team members whose duties range from software development to sales to project management.
One of the key positions at CrowdSource is the Content Project Coordinator. This person is responsible for overseeing a particular content project from start to finish. Once a client has signed a contract with CrowdSource, it’s up to the Coordinator to make sure the contract is fulfilled. ...

Understanding Client Needs

The first step to managing a successful content project is understanding a client’s goals and expectations. Online publishers and retailers have a variety of different content needs, including product descriptions and reviews, fact-based Q&As, long-form articles, creative blog posts and more.
A Content Project Coordinator works closely with an Account Manager to make sure CrowdSource has a solid understanding of a project’s final objective and identifies a clear, step-by-step process to meet that objective.

Providing Quality Content

In today’s world of online publishing and search engine optimization, only one thing’s for certain: quality content is king. And if there’s one thing CrowdSource prides itself on, it’s creating high-quality content. A Content Project Coordinator is the person who’s responsible for ensuring our freelancers are creating the best content possible.
This involves setting up projects in WorkStation, the online portal where our freelancers log in to complete their work, and training freelancers via instructions and webinars. It also requires regular review of freelancers’ work, as well as providing helpful, constructive written feedback.

Hitting Goals

In addition to creating content that is going to add real value to their websites, clients also have clear-cut goals on how much content they need to produce. It is not unusual for a client to require several thousand pieces of content in a single week.
A crucial aspect of being a Content Project Coordinator is ensuring that we meet these production goals while maintaining quality. If a project is at risk of missing a goal, it is the Project Coordinator’s duty to get creative and come up with ways to make sure we hit it (like a special competition or bonus program for freelancers).

Doing What You Love

... When it came time to make a decision about college, it was a virtual no-brainer: I would study English Lit. Inevitably, over the course of those next four years, I would be relentlessly asked: “Oh, interesting. And what are you going to do with that degree?”
Never in a million years would I have imagined the answer that I’m able to give today: I’m responsible for creating high-quality, engaging content for some of the world’s largest online publishers and retailers. ..."​

What Does a Data Project Coordinator Do?
Aug 19, 2015

" Some people see the world through jigsaw-shaped lenses. They are quick to identify the pieces of a puzzle, and get a kick out of analyzing those parts to find the most efficient – and accurate – solution. That’s us. At our core, we are problem solvers. On a larger scale, we use big data to solve complex problems for some of the world’s biggest companies. Sound awesome? It is. Welcome to the Operations Team.

The coolest part about the work we do is that we are hands-on through every step of the project, from start to finish. Our Data Project Coordinators work directly with key stakeholders to develop a strategic plan for meeting client objectives. (Cue: jigsaw spectacles). They are then responsible for project implementation, execution and evaluation. As a Data Project Coordinator, you own your projects, and will (and should!) take pride in what you have accomplished.

Every day is different at CrowdSource; if you walked in tomorrow, you might find us talking through a way to streamline the project instructions for our freelancers, or brainstorming with our Account Managers on a client’s priority shift. ... We keep a quick pace around here. The daily ebb and flow (emphasis on the flow, not a whole lot of ebb) requires us to remain flexible but focused.

Our team reflects the work we do – we fit together well. We have identified our moving parts and put them together to leverage a brilliant mix of personalities and skill sets. Number crunchers, data geeks, math whizzes – we wear those titles proudly. We’re all wired with a keen eye for detail, and that keeps us on track. We secretly really like each other too. It’s not uncommon to find us all together outside of the office – game nights, happy hours ..."​
 

clickhappier:
The Nov 16, 2015 announcement of the name change from CrowdSource to OneSpace:

Surprise! Email a few minutes ago:

" We’re excited to announce that on Nov. 17, 2015, CrowdSource will unveil our new identity: OneSpace.

As we expand to provide a more robust set of solutions to both clients and freelancers, we feel it’s important for our brand to reflect our commitment to providing a single, simple-to-use platform where businesses and freelancers can collaborate.

Here’s what our transition to OneSpace means for you:
•WorkStation will now be called OneSpace, and you’ll access it from work.onespace.com.
•Once you log in, you’ll see our new logo, as well as our updated color scheme, featured throughout the platform.
•A system outage will occur on Nov. 17 to facilitate the transition from the old platform to the new one. The outage will begin at 11 a.m. CST and last for approximately one hour.

More changes are on the horizon. Look for announcements in 2016 heralding exciting improvements to OneSpace and the opportunities available to you as the biggest brands on the Internet launch their work on OneSpace.

Thanks for all you’ve done to make CrowdSource a success! We’re confident you’ll like what OneSpace has in store for you over the next year and into the future. "​

Some time after it's back up (might be a day or two), G @kadauchi and I will have to update our CS WS scripts, for the small handful who use them. [done]


Edit: This 'OneSpace' name is the very-belatedly-decided winner of the $5000-prize domain name contest from back in January 2015, 10 months ago.

And this is looking to be a lot more than just a name/logo change, with some potentially concerning new features and approaches. :\ A couple of clarification posts from CS_Julia on their forum responded to some people's concerns.
 
clickhappier:
Announced Dec 15, 2015 - it's now Jan 17, 2016, and I think the CW test is still down...

Now nobody who isn't already in on those will be able to get in for a while... pretty much all their writing work requires passing both their 'Junior Copy Writer' test and then their 'Copy Writer' test, and CS/OS employee OS_Sam posted today, 12/15, that "The Copy Writer course will be down for maintenance until the beginning of January." (Note that they have unfortunately called their tests 'courses' since summer 2015, despite the absence of any actual training, to the confusion and consternation of many.)​
 

clickhappier:
OS sent out a "2015 OneSpace Year in Review!" email before Christmas in Dec 2015:

" We've reached the end of 2015 and are putting the finishing touches on our 2016 plans. There are a lot of new and exciting things coming your way in 2016, but before we hit the ground running in the New Year, here is a quick look into everything OneSpace accomplished with your help in 2015. Thank you for all of your contributions over the last year, and here's to an even more successful 2016!

OneSpace Year in Review

Completed Tasks: 28,186,340
OneSpace freelancers completed over 28 million tasks in 2015. That means that a task was submitted every ~1.2 seconds!

Unique Freelancers: 58,978
The 58,978 freelancers that completed tasks for OneSpace in 2015 would fill Chicago's Soldier Field.

Full-Time Equivalent: 984
If the work performed by OneSpace freelancers in 2015 had been assigned to internal resources, it would have required 984 full-time employees to deliver the same results.

Years of Work: 885
The cumulative time OneSpace freelancers devoted to client projects in 2015 adds up to 885 years of production. "

(I think these numbers would include the work that went through MTurk before CS/OS shut that down in summer 2015, in addition to what ran through their direct site throughout the year. Will be interesting to see what the numbers look like next year.)

That's an average of 478 tasks per worker over the year. (Some are quick 'microtasks' that just take a few seconds, some are complex writing tasks that could take hours, and some anywhere in between.)

The typical definition of a full-time equivalent employee (FTE) is 2,080 hours per year (40 hours a week x 52 weeks = 2080). So the claim of 984 FTEs may represent 2,046,720 combined hours of work over the year, per CS/OS's hidden-from-us-and-impossible-to-be-entirely-accurate attempts to track workers' time spent in tasks. (But it's unclear exactly what they meant by the FTEs item vs the 'years of work' item.)

If so, that's an average of 34.7 hours per worker over the year, and 0.073 hours (about 4.36 minutes) per task.
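
The arithmetic behind those averages, for anyone who wants to re-derive them (figures are from the email above; treating 984 FTEs as 984 x 2,080 hours is my assumption, as noted):

```python
tasks = 28_186_340   # tasks completed in 2015 (from the email)
workers = 58_978     # unique freelancers (from the email)
ftes = 984           # claimed full-time equivalents (from the email)

fte_hours = ftes * 2080                  # assumes FTE = 2,080 hours/year
print(round(tasks / workers))            # ~478 tasks per worker
print(round(fte_hours / workers, 1))     # ~34.7 hours per worker
print(round(fte_hours / tasks * 60, 2))  # ~4.36 minutes per task
```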

PS: And of course they did this over a week too soon; although most of the OneSpace employees are apparently on vacation Dec 24-Jan 4, some tasks have continued to be posted and submitted during that time, so the year wasn't quite over yet.
 

clickhappier:
Jan 11, 2016:


Big uproar on the OS forum today, Jan 11, 2016 - the high-volume 100/200-word Q&A writing project for Ask.com is coming to an end this month. Starting this evening, the remaining work in it is restricted to those who've earned a high enough level from their previous work on it, and they're estimating it will run out entirely in about 2 weeks. This likely also means the closed-workforce lead-in projects "Classify: A Question" and the ones to write and edit the questions (before writing and editing the answers) are gone too.
 

clickhappier:
On Jan 20, 2016, OS forum user 'SleaZow' posted a clarification on why some workers have gotten another chance after failing an OS qual test 3 times, but other workers haven't:

" I can confirm that every 30 days there are people who take a look at some profiles which get "another chance" - you just basically need to be online every now and then, and just be active.. Hopefully you get chosen to get another chance. Got this information from e-mail from support :) so i guess it is pretty trustworthy :) "​
 

clickhappier:
An update to my Dec 15 post above about the 'Copy Writer' test being down for maintenance:
The 'Copy Writer' test was made available again on Jan 26.
 

clickhappier:
Brief mention in an article about former NFL player Jackie Smith, pointed out on the OS forum today:
Jackie Smith talks extensively about the drop that almost ruined his life | SI.com - Wed Jan. 20, 2016

" That’s why he was on that Segway, visiting his son Greg, 41, at the office where the younger Smith is Vice-President of Sales for OneSpace, a company that helps place freelance writers. The Segways—like the old school telephone booth, the working scoreboard, the Nascar car and the full-size replicas of the Simpsons and the Millennium Falcon—are superfluous symbols of success; they’re here just. . . because. "​
 

clickhappier:
OS is going to be significantly revamping their work site some time next month (March 2016). Our handful of interface-improving scripts will all either become obsolete (no longer needed) or have to be extensively updated when that happens. They apparently picked some favorite workers to 'beta test' the new design, but I'm not one of 'em. See the OneSpace forum thread for what little is known so far, which may change.