
Waiting on new site... how long to move in the Google SERPS?

10:49 pm on Nov 27, 2017 (gmt 0)

New User

5+ Year Member

joined:Jan 2, 2014
posts:39
votes: 0


Quick facts: I put up a brand-new domain on August 9th. WordPress platform, with a sitemap. Google has only "indexed" 25% of my images and 35% of my articles. No SERP traffic.

I continue to read here about a Sandbox-like effect for new sites (I know, there is no such thing as a sandbox, maybe a bandsox).

A post from May this year had one member saying his new domain had finally started seeing some Google traffic, and it looks to be permanent.

What I did not see in any of the replies on that post was a time frame. My new domain has been up for four months. I have about 200 decent links: none with exact-match anchor text, most naked URLs, branded with the site name, or innocuous words related to the content... so no problem with Penguin.

Can I please get some time frames from members with new sites?

I'd be more than happy to put the site away for a year if I knew nothing was going to happen for a year. No sense wasting my time, etc.
11:56 pm on Nov 27, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15755
votes: 828


Google has only "indexed" ... 35% of my articles
I am inclined to say that this has more to do with GSC reporting than with de facto indexing. Do some exact-text searches on assorted pages and see if they come up. (To ensure that I am not talking out of my hat, I tried it with my three most recently posted pages--the newest being 9 days ago--and verified that they are all in the index.) If some pages don't come up at all, you will need to verify that they have all been duly crawled. In fact, you should have seen a full crawl within 24 hours after Google first learned of the new site's existence.
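For example, a minimal check of this sort (example.com stands in for your domain, and the quoted text is copied verbatim from one of your pages):

"an exact sentence copied word for word from one of your articles" site:example.com

If the page is in the index, it should come straight back as the top result.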

Now, if the problem is that everything is in the index, but nothing on your site floats to the top when you search for “cats” or “jazz” ... well, that’s an entirely different problem.
12:51 am on Nov 28, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts: 2662
votes: 794


Something doesn't add up?
No SERP traffic

yet
I have about 200 decent links,

How did you acquire 200 links if you have no SERP traffic?

Are you buying traffic?
2:48 am on Nov 28, 2017 (gmt 0)

New User

joined:Oct 11, 2017
posts:13
votes: 9


It took my latest new site 3 months to get out of the proverbial sandbox. I did a press release, manual relevant blog comments, and 4 guest posts. Unless most of those links you received were from a press release, I would imagine you caused your site to stay in the sandbox longer with too high a link velocity.

*edited to clarify what type of blog comments.
11:29 am on Nov 28, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member redbar is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Oct 14, 2013
posts:3229
votes: 496


Brand new site launched with 200+ pages on 15th October; all pages indexed and ranking within a couple of weeks. Subsequently another 200 pages were uploaded.

In early November Google stopped indexing the new pages at about 350; today it is 359, so hopefully it's started indexing again, since I have a load more pages to construct and I ain't going to build them if Google's not bothering to index them.

Images are important to me, yet still less than 25% of my images have been indexed. However, given Google's image theft, I haven't relied on them for traffic for 3+ years now.

I never go for links and never buy traffic.
11:55 am on Nov 28, 2017 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Feb 5, 2004
posts: 593
votes: 79


You never mentioned a sitemap. Submit a sitemap to Google. It's the best way to tell them about your pages.
1:20 pm on Nov 28, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member redbar is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Oct 14, 2013
posts:3229
votes: 496


You never mentioned a sitemap


That's an interesting question. In 24 years I have never produced a sitemap since I have always ensured that my navigation is my sitemap.

Certainly I would agree IF all pages and sections are not accessible through the standard navigation, and since the site is built in WordPress, they may not be.
1:41 pm on Nov 28, 2017 (gmt 0)

Preferred Member

10+ Year Member Top Contributors Of The Month

joined:Feb 5, 2004
posts: 593
votes: 79


I like sitemaps, especially for new sites, as you don't have to wait for Google to spider an entire site (if pages are not all linked from the homepage). They are also handy for letting Google know when a page gets updated. Plus, for any modern CMS there are plenty of sitemap plugins that automate the whole process.
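For reference, a bare-bones sitemap under the sitemaps.org protocol looks like this (example.com is a placeholder; the lastmod element is the part that signals an update):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/some-article/</loc>
    <lastmod>2017-11-28</lastmod>
  </url>
</urlset>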
5:49 pm on Nov 28, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:1045
votes: 132


I've been working with a new site launched at almost exactly the same time. How many pages are on the site? I found adding links helped get it crawled deeper, and it is starting to rank now; I fired 1,400 backlinks at it. According to Google, 99% of the sitemap is indexed, and 90% of the images.
6:04 pm on Nov 28, 2017 (gmt 0)

New User

5+ Year Member

joined:Jan 2, 2014
posts:39
votes: 0


Lucy,

Something doesn't add up?
No SERP traffic

yet
I have about 200 decent links,

How did you acquire 200 links if you have no SERP traffic?

Are you buying traffic?

Nope, never bought traffic in twenty years. That explains my perplexity during the Panda/Penguin purge. Never spammed ezines or article directories, never bought links, no blogrolls, no footer links... my problem was actually very simple to solve: two lines in the disavow file.

anchortext: (Chinese characters for my home page)
domain: (edu with bad software)

Of course, you cannot use anchortext: in a disavow file, so I had to have a very messy disavow.txt (code is poetry to some people; it's diarrhea of the page for others :)
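For anyone unfamiliar with the format: a disavow file is just a plain-text list, one entry per line, with # for comments, and you can only list whole domains or individual URLs, never anchor text. Something like this (the domains here are placeholders):

# .edu forum running exact-match link software
domain:example.edu
# individual spam pages
http://spam.example.com/page1.html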

Here's the problem in a nutshell:

1. I got hit by a huge wave of Chinese spam, before even Panda. It never hurt me until Google decided that it hurt me. I'm sure many people have experienced similar situations.

2. I never thought to look at my .edu links. One of the .edu sites had software set up to generate exact-match anchor text for links. I received about 100 links from 100 different users of the site. All that exact-match anchor text blew away my site under Penguin.

By the time I found it and disavowed it, it took two and a half years of waiting for Google to run the Penguin algo. Once they ran it, I saw all my keywords move up in the SERPs for the first time in six years.

I only have a moral victory from the experience. I always followed Google's guidelines, yet had to listen to six years of Google trolls saying stuff like, "that's what U get for spamming and putting up thin crap content."

I never spammed (I got spammed), and of course I never put up crap content (all original words, pictures and videos). So I was right and the Google trolls were wrong.

It was a relief to finally get my logic back and accurately read the Google algo again. I just took what I learned and leveraged my old site by a factor of fifty for one last run at a good, honest site in my niche :)
6:10 pm on Nov 28, 2017 (gmt 0)

New User

5+ Year Member

joined:Jan 2, 2014
posts:39
votes: 0


Also, thanks to everyone else for the input.

Sitemap: yes, from day one.

I can't figure out the logic. Google has had that sitemap since day one; they keep revisiting it and still aren't indexing.

I also realize there is a difference between being in the index and being anywhere in the SERPS.

Also, I remember from over a year ago that GSC is basically worthless. It took seven months for them to report that my site had only 190 indexed pages; I had gone noindex on another 200 or so pages seven months prior.

During those seven months, GSC also kept putting up old link data. For example, one day they would show my 13,000-plus Pinterest links; the next month they would not; the month after that there are 7,000 Pinterest links, etc.

It was only after they ran the Penguin algo that I could actually mark my beliefs to market. Your mileage with GSC may vary.
6:43 pm on Nov 28, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts: 2662
votes: 794


@itsjustme2 It wasn't Lucy asking those questions; it was me.
A few things: I am not accusing you of anything, spamming or not spamming, producing good or poor quality content. I am simply asking you logical questions based on the information provided. How do you acquire 200 links to a website that has no traffic?

There is nothing wrong with buying traffic, in other words using services such as AdWords; it may well be a necessary step to acquire a user base that will allow for further organic growth.

There is mounting evidence that disavow files are useless. Google themselves discourage webmasters from using them, and Google is pretty good at ignoring links from low-quality sources. The consensus around here is that being spammed by some Chinese website should not cause any issues, as it will simply be ignored. This was likely not the case when Penguin first rolled out, but it seems to be the case today. Disavowing will probably not hurt you, but it is unlikely to help you. By the same token, Google may be devaluing the links that you have pointing to your website as well, for whatever reason. So depending on links alone is not sufficient to generate traffic.

GSC also kept putting up old link DB. For example, one day they would post my 13,000 plus Pinterest links, then the next month they would not. Next month there are 7,000 Pinterest links etc.


The link report in GSC is known to be inaccurate. It will show you links that are old, follow links, no-follow links; the numbers will change from day to day, and so on. I don't understand the purpose of this report given that it is inaccurate and unreliable, but it is there. My guess is that Google doesn't want webmasters obsessing about links.

Generating good-quality, unique content is necessary, but it is by no means sufficient. Anyone can go out, pick a website at random, rewrite the content so that it is original, show different yet relevant pictures, and publish the pages. This would be good-quality, unique content, but it would be unlikely to rank. Ranking requires you to produce high-value content, and to be able to convey that value to a robot.

What does your search analytics report show in GSC? Are you earning impressions for relevant keywords? Are the keywords shown the ones you expected? What about regionally: where are the impressions coming from?
6:48 pm on Nov 28, 2017 (gmt 0)

Administrator from GB 

WebmasterWorld Administrator engine is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:May 9, 2000
posts:26238
votes: 998


I can't figure out the logic. Google has that sitemap since day one, they keep revisiting it and still not indexes.

My first thought is that the pages on your sitemap just don't yet have enough "value" from external sources. This often happens with "brand new" domains, as you described it.
I have about 200 decent links

I'd start working on developing new links to the sub pages of the site.
7:27 pm on Nov 28, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15755
votes: 828


It wasn't Lucy asking those question it was me.
Yes, the questions I asked were more prosaic (and if they were answered, I can't find it): Has the entire site been crawled--preferably all at once, immediately after its discovery--and can specific content be found in the index?

If a search engine can't find all pages without a sitemap, how can humans be expected to do so?
7:37 pm on Nov 28, 2017 (gmt 0)

New User

5+ Year Member

joined:Jan 2, 2014
posts:39
votes: 0


Red Bar: Brand new site launched with 200+ pages on 15th October; all pages indexed and ranking within a couple of weeks. Subsequently another 200 pages were uploaded.

In early November Google stopped indexing the new pages at about 350; today it is 359, so hopefully it's started indexing again, since I have a load more pages to construct and I ain't going to build them if Google's not bothering to index them.

Thanks...I'm just trying to think through the logic. Here's what I think is a reasonable and logical hypothesis.

Your site was comparatively small, so Googlebot was able to spider it two or three times

My site is 10,000 pages on WordPress, too large for Googlebot to even spider completely once. Plus, WordPress is filled with so many bugs. For example, Googlebot insists on trying to find URLs like domain/tag/pagename or domain/author/pagename, etc.

They have found 10,000 pages that never existed and were never intended to exist, and there's nothing I can do short of rewriting all of the WordPress code myself.
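(For what it's worth, archive paths like those can usually be kept out of the crawl without rewriting anything; a robots.txt sketch, assuming the default /tag/ and /author/ URL structure, and bearing in mind that robots.txt stops crawling, not indexing:)

User-agent: *
Disallow: /tag/
Disallow: /author/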

My site is 10,000 pages (I'm a wizard of concatenation), and I could visualize a big-picture site. Most of the pages, 60%, are noindex. I needed them up at the beginning of the site because it ties together as a complete package.

Noindexing pages is easy: just put a new column on a spreadsheet that translates into noindex.
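(Presumably each page flagged that way then gets the standard robots meta tag in its head:)

<meta name="robots" content="noindex">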

Logically, that ought not to have been a problem, because I always had the sitemap to say: these are the only pages and posts on my site with index meta tags.

So, I do think that's a logical hypothesis explaining the delay in my site's indexing. In the language of formal logic, it's a valid argument. I won't know if it's a sound argument until I mark my belief to market.

Finally... someone asked how I got two hundred links. I mentioned that after I saw my old domain recover, I designed and built a new domain based on the old domain, leveraged up by a factor of fifty or so.

Some of those links are 301 redirects from my old site. I'm not sure if Google has even found all of them yet, because I only started down the 301 redirect route slowly, about five weeks ago.

Those links are a representative sample of all the good links I received freely from other bloggers, writers and sites over the course of 15 years. If it works, I will continue with the 301 redirects.

Right now I have no basis for making a good decision because Google does not provide me with the tools or information I need to make a good decision.
8:00 pm on Nov 28, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member Top Contributors Of The Month

joined:Apr 1, 2016
posts: 2662
votes: 794


My site is 10,000 pages on WordPress, too large for Googlebot to even spider completely once.

I have a site that has millions of pages and Google crawls more than 10k pages a day on average. So this is by no means too much for Google to crawl.

They have found 10,000 pages that never existed

and
Most of the pages, 60%, are noindex. I needed them up at the beginning of the site because it ties together as a complete package.


I would be concerned that there are technical issues making it difficult for Googlebot to crawl the site. It may be that Googlebot goes and grabs a bunch of pages, and then most of those pages either don't exist or are noindexed, so something like 60 to 80% of your crawl budget is wasted.

WordPress is great for small sites, but it has its limitations. You may be facing one of those limitations, in which case you will not have a choice.

Those links are a representative sample of all the good links I received freely from other bloggers, writers and sites over the course of 15 years. If it works, I will continue with the 301 redirects.


Also, you initially said this was a new site. Do I understand correctly that this is a new domain with content and links being transferred from a site that was previously penalized?
8:00 pm on Nov 28, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:1045
votes: 132


They have found 10,000 pages that never existed and never intended to exist and there's nothing I can do short of rewriting all of Wordpress code myself.


Just install Yoast and then configure it.
10:28 pm on Nov 28, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:1045
votes: 132


I think I'm wasting my breath on this post.

WordPress is great for small sites, but it has its limitations.


Yeah right!

11:44 pm on Nov 28, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15755
votes: 828


At the risk of fatally derailing the thread:
“X is great for Y but it has its limitations” is not really a statement that should produce strong reactions, whether pro or con, for any given values of X and Y.
12:44 am on Nov 29, 2017 (gmt 0)

Senior Member

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month

joined:Sept 14, 2011
posts:1045
votes: 132


What CMS doesn't have limitations? It's just a bullshÏt statement, that's all, just saying... a bit like saying Perl, PHP, Apache, JavaScript and MySQL have limitations...

I ain't angry or nothing, just feel I'm wasting my breath.
2:06 am on Nov 29, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:10136
votes: 1009


GSC is one thing. What do the raw logs actually show? Viewing that data will tell you more precisely what G has or has not done than anything else.
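If you want to automate that, here is a minimal sketch in Python, assuming combined-format Apache or nginx logs in access.log; the user-agent match is crude and spoofable, so a proper check would add reverse-DNS verification:

import re
from collections import Counter

# Combined log format: IP - - [date] "METHOD /path HTTP/1.1" status size "referrer" "user-agent"
line_re = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(\S+) (\S+) [^"]*" (\d{3})')

crawled = Counter()   # URLs Googlebot requested
statuses = Counter()  # response codes it was served

with open("access.log") as log:
    for line in log:
        if "Googlebot" not in line:  # crude UA check; spoofable
            continue
        m = line_re.match(line)
        if not m:
            continue
        ip, method, path, status = m.groups()
        crawled[path] += 1
        statuses[status] += 1

print("Status codes served to Googlebot:", dict(statuses))
for path, hits in crawled.most_common(20):
    print(hits, path)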

This is a site relaunch on a new domain; it is not a new site. G knows this, and the algo is probably taking that into consideration (as much as machines can "consider" anything).

View the old site logs as well. See what is happening there.

You need more info on what is actually happening, as opposed to accepting what G is reporting (not a bash, just different reporting methods!).
5:30 pm on Nov 29, 2017 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member redbar is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Oct 14, 2013
posts:3229
votes: 496


Your site was comparatively small, so Googlebot was able to spider it two or three times


Possibly; however, I have several sites running into thousands of pages and have noticed in the last few years that Google never, ever indexes everything within the first few weeks. There is always a delay of several months for some reason. Interestingly, Google has started to index more pages for this specific site this week after a two-week layoff, and the images indexed have increased slightly.

My site is 10,000 pages on WordPress,


For various reasons I stopped using WordPress years ago. Enough said.
6:18 pm on Nov 29, 2017 (gmt 0)

New User

5+ Year Member

joined:Jan 2, 2014
posts:39
votes: 0


NickMNS: Thanks for the input, I really appreciate it.

I know I did not present myself clearly in this thread, forgetting that I had posted other info in two additional, separate threads.

1. This is a new domain for rebranding.
2. The old domain got hosed by Panda and Penguin (I had the info on Penguin: exact-match anchor text from an .edu domain).

I waited a year and a half for Google to provide the disavow file. I had to put the Chinese spam in at the time; I missed the .edu exact-match anchor text links. Google did not run the Penguin algo for another two and a half years.

When Google ran the Penguin algo, all my keywords moved up in the SERPs for the first time in six years.

Correlation does not mean causation. However, given that those were the only changes I made, I assume there's a high degree of probability that it was the Chinese spam and the edu exact match anchor text that caused the problem.

So, problem solved. All I wanted to do was rebrand the site and build a bigger and better site now that my algo logic has returned. Good content, links, etc.

I see no problem with the 301-redirected links from domain A to domain B, because I accumulated all kinds of legitimate links over the course of the prior 15 years that I thought I would lose because of the Penguin penalties I had experienced.

If Google does not hand out Penguin-related penalties (or filters, I don't want to mince words) for these types of links anymore, good for everyone who did not have to suffer through the four years of Penguin wars that I did.

I can only formally address one point at a time on these threads, otherwise, it gets a bit fuzzy. Sincerely, I really appreciate your feedback.
6:26 pm on Nov 29, 2017 (gmt 0)

New User

5+ Year Member

joined:Jan 2, 2014
posts:39
votes: 0


SEOSkunk: What CMS doesn't have limitations? It's just a bullshÏt statement, that's all, just saying... a bit like saying Perl, PHP, Apache, JavaScript and MySQL have limitations...

I ain't angry or nothing, just feel I'm wasting my breath.
--------
I agree completely. I used three different CMSes back in the day, starting with PostNuke. I switched to HTML and HTML5/Twitter Bootstrap for ten years because I found CMSes still too buggy for my tastes.

I switched back to WordPress because I do think they produce excellent-quality code. I have experimented with 60 or so different plugins that I would also rate as excellent and creative. I don't use them all; I use a minimal number of plugins to keep the site as speedy as possible.

Of course there are technical limitations. Code is based on logic, logic is filled with paradoxes, and logic is not perfect; neither is code (I'm formally trained in logic, not code).

I'm just trying to see my way through some of the WordPress idiosyncrasies and ensure that the site is solid. The 404s that pop up are not technically a problem; it's good they are not coming up as duplicate content.
6:27 pm on Nov 29, 2017 (gmt 0)

New User

5+ Year Member

joined:Jan 2, 2014
posts:39
votes: 0


Redbar: Possibly; however, I have several sites running into thousands of pages and have noticed in the last few years that Google never, ever indexes everything within the first few weeks. There is always a delay of several months for some reason. Interestingly, Google has started to index more pages for this specific site this week after a two-week layoff, and the images indexed have increased slightly.

--------

That is very encouraging. If Google is simply slow to index new sites in general, then I have no problem waiting like everyone else.

Glad to see your indexing picked up.
6:33 pm on Nov 29, 2017 (gmt 0)

New User

5+ Year Member

joined:Jan 2, 2014
posts:39
votes: 0


Tangor: GSC is one thing. What do the raw logs actually show? Viewing that data will tell you more precisely what G has or has not done than anything else.

------
Thanks for the good suggestion. I have been busy over the past four weeks researching and writing content, and no doubt checking on GSC too often cuz it increases my anxiety :)

I do need to see the raw logs and how Googlebot gets to the site.

I did see the logs from the security plugin. Interestingly enough, I already had to fight off a brute-force bot attack. Successfully, I might add. Jeez, what a way to start a new domain.
6:36 pm on Nov 29, 2017 (gmt 0)

New User

5+ Year Member

joined:Jan 2, 2014
posts:39
votes: 0


Engine: My first thought is that the pages on your sitemap just don't yet have enough "value" from external sources. This often happens with "brand new" domains, as you described it.

--------

That sounds very logical to my ears also. Here's hoping that as I continue writing and building content, I can at least figure out a good marketing strategy as a complement.
9:50 pm on Nov 29, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15755
votes: 828


itsjustme2:
Psst! Reading these replies will be a whole lot less confusing if you put the stuff you're replying to in
Quote
markup. If you don't see the Quote button, you can type [ quote] and [ /quote] manually (omitting the spaces, of course).
10:37 pm on Nov 29, 2017 (gmt 0)

New User

5+ Year Member

joined:Jan 2, 2014
posts:39
votes: 0


itsjustme2:
Psst! Reading these replies will be a whole lot less confusing if you put the stuff you're replying to in


I've been reading WW and have been a member for fifteen years. I've always wanted to know how to respond to a particular message rather than the whole thread.

Where's the quote button? I'm on a laptop using Chrome.
12:20 am on Nov 30, 2017 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time 5+ Year Member Top Contributors Of The Month

joined:Apr 9, 2011
posts:15755
votes: 828


Where's the quote button?
There are two ways to get to it. One is to click Post Reply instead of Quick Reply. I have never personally used this option, as it's several millimeters further away, but it does get you the full Post window with formatting options. The other is to click Preview after composing (part of) your post. This, too, has all the formatting options.

Unlike some forums, this one doesn't have a direct-reply option that yields something like “itsjustme2 wrote / blahblah” in a ready-made little box of its own. It's just a generic Quote box. But it comes without those silly oversized quotation marks at the beginning, so there is that.