Forum Moderators: Robert Charlton & goodroi
We run about 200 sites across Europe, from tiny to large, and I can see no discernible difference on any of them.
I must agree with you. My results are pretty stable.
There has been a lot of negativity during the night, but I just cannot see that much of a change anywhere for most of my keywords. Where I do see changes, we are in fact moving up.
Most of our SEO has been on-site techniques, combined with interlinking of pages. In fact, I'd say every single page on the 10 sites I co-own and work with has at least one link to another page (not necessarily the main page) on another of the 10 sites. Some pages have reciprocal links to another language version on another domain.
This has been warned against, but since it actually makes sense to our visitors we've kept doing it, and it works out great.
We link pretty freely and quite generously to other sites as well, but we have never done any organized link exchanges with sites we don't own.
The oldest sites are from 1998 and the newest were registered last fall. Of course, the newer ones aren't doing as well as the older ones, but they are picking up in the SERPs.
That said, I would very much like to hear more about the sites that have been dropped.
What techniques have you been using?
Are you dropped for each and every possible search?
Are you dropped on these IP datacenters only, or also on google.com? Are you just as invisible on other Google domains?
I'm really curious.
Why?
Because it appears all you need is two PR 5 sites cross-linked to each other on about every page.
One of the new entrants in my area has 613 backlinks, and 95% are internal or from its big brother.
Stuffing keywords in the domain or page filename doesn't seem to hurt either!
Big step back Google.
Could it be that those people who weren't hurt have huge sites and numbers of internal backlinks?
As I said above, that's what we have been doing for years, and it has always worked for us. I wouldn't say we are huge, though; the larger sites have 5,000 to 10,000 pages.
And the flip side of this is obviously that it's exactly the same technique used by the pseudo-directory sites that scrape bits of content from other sites and present it as a page just to get AdSense clicks.
Looksmart and altavista.looksmart.com are the biggest of them all.
For a keyword combination where a page from my site used to show up 5th, it's no longer in the first 100. In fact, a sub-page from my site hijacked by Looksmart shows up 2nd (I mean the hijacker's site carrying my page shows 2nd), and it's not even completely related. For example: if you were to search for the keyword "fruit", it's the page for an "orange".
I also tried searching for the entire title of that page, but that still showed the hijacker's site 5 positions above mine.
Seems like Google is doing this major update to clean up the hijacks, and it's all getting messed up for now. Maybe in a week's time everything will be perfect. Keeping my fingers crossed.
Long ago I stupidly set the 404 error handler to redirect back to my home page... In early January I added lots of pages and apparently made some internal link typos. So, yesterday I discovered the SERPs showed 7 copies of my home page - 6 pointing at invalid URLs.
Do you think it's likely I've been penalized for duplicate content?
I'm searching for anything that would make me think this dramatic drop is temporary.
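For what it's worth, the failure mode described above is what's often called a "soft 404": the server answers requests for missing pages with a redirect to the home page instead of a real 404 status, so every bad internal link becomes another indexable copy of the home page. A minimal sketch of the distinction (the function name and category labels are mine, purely illustrative):

```python
# Sketch: classify how a server responded to a request for a URL
# that should not exist. A "soft 404" (missing pages redirected to
# the home page, or an error page served with status 200) is what
# can put duplicate copies of the home page into an index.

def classify_404_handling(status_code: int, was_redirected_home: bool) -> str:
    """Classify a response to a request for a nonexistent URL."""
    if status_code == 404:
        return "hard-404"           # correct: tells crawlers the page is gone
    if was_redirected_home:
        return "soft-404-redirect"  # every bad URL becomes a home-page duplicate
    if status_code == 200:
        return "soft-404-page"      # custom error page served with a 200 status
    return "other"

# The setup described above: 404s redirected to the home page.
print(classify_404_handling(200, True))   # soft-404-redirect
print(classify_404_handling(404, False))  # hard-404
```

The fix for the situation described is simply to make the error handler return a genuine 404 status instead of a redirect, so crawlers drop the invalid URLs rather than indexing them as home-page copies.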
I have a big shopping portal with thousands of product pages.
Here is what I have been seeing since the update yesterday:
My positions have not worsened, but in the same positions Google now shows less relevant pages as the first and second results for my domain.
I have NOT lost any positions (only minimal changes)! But the pages Google is showing are so irrelevant that they should never appear in that spot.
The query
keyword site:mydomain.com
likewise shows the less relevant pages in the first positions, and only then the relevant pages of my domain.
Does anybody have similar problems?
Is there an explanation for this phenomenon?
Here is what I am seeing for a particular set of keywords that I have been watching:
There are 2 distinct sets of SERPs showing on the different datacenters. Both of these new sets are different from the ones showing prior to the update.
One set of the new serps=bad, one set=good (in my opinion).
I have just noticed that, while looking at the good SERPs, if I add &filter=0 to the URL, I get the EXACT result set that I call the bad SERPs.
I'm not sure what this means, if anything, but I'm hoping the bad SERPs I'm seeing with the update are due to some filter that has yet to be applied.
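For anyone who wants to reproduce this comparison, here is a small sketch of how one might toggle that parameter on a results URL (the example URL is a placeholder; at the time, filter=0 was widely understood to switch off Google's duplicate-result filtering, so diffing the two result sets shows exactly what the filter removes):

```python
# Sketch: build the filter=0 variant of a Google results URL so the
# filtered and unfiltered SERPs can be compared side by side.
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

def with_filter_off(url: str) -> str:
    """Return the same search URL with filter=0 appended to the query."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query["filter"] = "0"  # disable the duplicate-result filter
    return urlunparse(parts._replace(query=urlencode(query)))

print(with_filter_off("http://www.google.com/search?q=example+keyword"))
# http://www.google.com/search?q=example+keyword&filter=0
```

Opening both URLs in two browser windows and comparing the top results is the quickest way to see whether a "good" and "bad" SERP pair differs only by that filter.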