You definitely can get syndicated content to outrank the original. It happens all the time. Essentially, whichever site promotes its copy of the page most effectively is the one that will rank.
Keep in mind that when a number of duplicates exist, usually only one of them will make it into a particular SERP. This is the duplicate content filter at work: the winning page ranks, and every other copy, starting with the first runner-up, is filtered out.
So not only must you outrank all other content for your targeted keyword, you must also outrank every duplicate of your own content, or your page won't show in the SERP at all.
There is no need to send out a memo to those “sites to quit messing up the search results”.
I just checked and the Duplicate Content Filter seems to be working just fine on those SERPs.
Google, like all other popular search engines, returns a maximum of 1,000 results per query, so you will never find a listing for a page at position 1001.
Google employs duplicate content filters in several ways:
1. They apply a filter within their crawling code. After a large number of pages with the same content have been indexed, they stop indexing new copies. This prevents the crawler from wasting valuable resources.
2. When there is a large number of pages with nearly identical content, they filter the main index and move the extra pages into the supplemental index. This again saves resources by keeping duplicate content out of the main index.
3. The SERP itself filters identical content whenever other relevant pages are available, to provide diversity in the results shown to users.
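To make that third point concrete, here is a minimal sketch of SERP-level duplicate filtering. All the names here are hypothetical, and real engines use far more sophisticated near-duplicate detection (shingling, simhash, and the like) rather than exact-match hashing, but the basic idea is the same: walk the ranked results and keep only the best-ranked copy of each distinct piece of content.

```python
import hashlib

def normalize(text: str) -> str:
    """Collapse case and whitespace so trivially different copies match."""
    return " ".join(text.lower().split())

def filter_duplicates(ranked_results):
    """Keep only the highest-ranked copy of each distinct content body.

    ranked_results: list of (url, content) tuples, best-ranked first.
    Returns the URLs that survive the filter, in rank order.
    """
    seen = set()
    filtered = []
    for url, content in ranked_results:
        fingerprint = hashlib.sha256(normalize(content).encode()).hexdigest()
        if fingerprint in seen:
            continue  # a better-ranked copy already made the SERP
        seen.add(fingerprint)
        filtered.append(url)
    return filtered

# Example URLs are made up; the syndicated copy outranks the original here.
results = [
    ("https://promoter.example/article", "The Original   Article text"),
    ("https://original.example/article", "the original article TEXT"),
    ("https://other.example/different", "Something else entirely"),
]
print(filter_duplicates(results))
# → ['https://promoter.example/article', 'https://other.example/different']
```

Notice that the original's URL never appears: once the better-promoted copy takes the slot, every other copy of that content is filtered out of the SERP entirely, which is exactly why you must outrank the duplicates, not just the other content.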
So, no worries: Google's duplicate content filter is working. If you ever do come across an example where it doesn't seem to be, please let me know, and we can co-author a memo to Google to let them know their duplicate content filter isn't working.
Here is what Google has to say about its duplicate content filter:
And a little on the silly side: