Forum Moderators: mack


Bing showing nothing for search traffic in the last 10 days

Bing appears to have deindexed or is not tracking any traffic to my site


scscamper

8:20 pm on Jul 8, 2014 (gmt 0)

10+ Year Member



Hi, our website was tracking just fine in Bing SERPs with no problems for quite a while, but when we checked the other day, we were suddenly showing absolutely nothing for page traffic to our site in Bing Webmaster Tools. This has been the case since approximately June 27th-28th, so just under two weeks now.

Has anyone else had this issue? Any clue what could be causing it? Last week we changed our charset and doctype to reflect the latest in HTML5. This resulted in a couple of issues:

1) Some extra characters appeared that seem to have prevented search engines from seeing the META information in the head. We made the change on the 26th, noticed the issue on the 3rd, and corrected it the same day.
2) We forgot to include a LANG attribute in the HTML tag.

We did notice yesterday, on the 6th, using the Bing SEO Analyzer, that Bing was unable to read our meta information. That was when we noticed our mistake with the LANG attribute and fixed it, so the SEO Analyzer now shows our meta information. But we wonder just how much damage that issue did to our site in Bing while it was live.

We have since remedied this in the last couple of days, but do you think this might have had something to do with it?

Any help would be SO greatly appreciated, guys. We need any help possible to get re-indexed in Bing. We have lost 66% of our Bing traffic. We have resubmitted a new sitemap, and we feel we have remedied the situation properly. Can anyone tell us how long it takes to recover before we start seeing our pages showing in Bing again?

not2easy

10:54 pm on Jul 8, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



The point in your question that stands out to me is this:
We made a change last week in which we changed our charset and doctype to reflect the latest in HTML5. This resulted in a couple of issues....
There were some extra characters that appeared...
bing was unable to read our meta information

What did you use to actually convert the documents to a different character set, or was it just a name change? The system or application that created a document (including HTML pages) determines its character set, and that is what the charset tag must reflect, not what is popular. If your document was created in Microsoft FrontPage using WIN-1252 (for example), you will need to open it and convert it to a different character set before you can claim it is UTF-8 or ISO-8859-1 or something else. The charset tag needs to accurately tell the browser (not the search engine) what encoding the page is built with, so that the browser can decipher it and render it as intended. From the issues you describe, it really sounds like that is all that happened, and as soon as you correct that little boo-boo it will probably be OK again.
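One way to sanity-check this yourself is to see whether a page's raw bytes actually decode under the charset you plan to declare. A minimal Python sketch — the byte strings here are made-up stand-ins; in practice you would read your own saved .html files in binary mode:

```python
def decodes_as(raw: bytes, encoding: str) -> bool:
    """Return True if the raw bytes are valid in the given encoding."""
    try:
        raw.decode(encoding)
        return True
    except (UnicodeDecodeError, LookupError):
        return False

# The word "café" saved two different ways:
utf8_bytes = "café".encode("utf-8")      # b'caf\xc3\xa9'
cp1252_bytes = "café".encode("cp1252")   # b'caf\xe9'

print(decodes_as(utf8_bytes, "utf-8"))   # True  -> safe to declare UTF-8
print(decodes_as(cp1252_bytes, "utf-8")) # False -> declaring UTF-8 would lie
```

Note the asymmetry: cp1252 will happily "decode" almost any byte stream (producing mojibake), so a successful decode only proves the declaration is *possible*, not that it is what the authoring tool actually used.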

Like the charset tag, the doctype needs to reflect what is actually on the page, not what you would like it to be, so be sure to validate your pages and CSS; otherwise some browsers will apply quirks mode, which can be surprising in its output.
These are some html 101 lessons that I recall learning the hard way myself.

scscamper

2:39 am on Jul 9, 2014 (gmt 0)

10+ Year Member



Thank you for the response not2easy. The pages were created in Adobe Dreamweaver, and edited there as well. However, what we had before was the following:

<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">


We have changed it to this:
<!DOCTYPE HTML>
<html>
<head>
<meta http-equiv="content-language" content="en-us">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">


The reason we changed it was that the headers our server was sending said the pages were encoded in UTF-8, but on each page we either had no charset meta tag or it was declared as ISO-8859-1.

Was there another way we should have handled that? Thanks for your help!

tangor

3:34 am on Jul 9, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



With changes like that... usually when I change a DOCTYPE I do it a few pages at a time and move forward from there using that DOCTYPE. I don't change any of the EXISTING DOCTYPES for pages in use, as there usually is no need.

Is this a dynamic site (generated from a DB, etc.)? If so, you might have confused Bing (and anyone else!).

Also, if the HTML on your page contradicts the new DOCTYPE, your page reverts to QUIRKS MODE... and that can be confusing as well.

not2easy

4:44 am on Jul 9, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



If you are actually converting the xhtml to html5, try changing that to:

<!DOCTYPE html>
<html lang="en">
<head>
<title>Your Title</title>

and leave off the UTF-8 part, since it seems the pages were accepted without a charset declaration before. That would be preferable to using a tag that does not accurately identify the encoding and causes errors. Browsers do a good job of guessing. When you add a UTF-8 tag, you are telling the browser to use that encoding, and that can cause errors if it is not correct. It was likely better for you not to have one.

In XHTML there are differences: your <table> elements (if any) should have had a summary="whatever" attribute, which is not valid in HTML5. You can drop or keep the trailing slash /> on self-closing tags in HTML5, but many remove it for smaller files. Old code like valign="top" is invalid. If you actually convert the HTML to HTML5, then by all means use the HTML5 doctype, but just like the charset tag, it is telling the browser how to read the document, and if there is a mismatch between what you call it and what it is, you will be in quirks mode. Quirks mode isn't just code that doesn't validate; it is code that is interpreted differently on different browsers.
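A rough way to hunt for the XHTML-era leftovers mentioned above is a simple pattern scan. This Python sketch is a toy heuristic with hypothetical pattern names, not a validator — run real markup through the W3C validator for an authoritative answer:

```python
import re

# Patterns for a few XHTML habits that are invalid (or pointless) in HTML5.
LEFTOVER_PATTERNS = {
    "table summary attribute": re.compile(r"<table[^>]*\bsummary=", re.I),
    "presentational valign":   re.compile(r"\bvalign\s*=", re.I),
    "XHTML doctype":           re.compile(r"<!DOCTYPE[^>]*XHTML", re.I),
}

def find_xhtml_leftovers(html: str) -> list:
    """Return the names of every leftover pattern found in the markup."""
    return [name for name, pat in LEFTOVER_PATTERNS.items() if pat.search(html)]

sample = '<table summary="sales" border="1"><tr><td valign="top">x</td></tr></table>'
print(find_xhtml_leftovers(sample))
# -> ['table summary attribute', 'presentational valign']
```

Extending the dictionary with more patterns (e.g. align=, bgcolor=) is straightforward, but regexes on HTML are only a first pass.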

scscamper

5:05 am on Jul 9, 2014 (gmt 0)

10+ Year Member



Great responses guys. Overall, I have not updated all the code on the pages, so much of it is still XHTML. Will the:
<!DOCTYPE html>
<html lang="en">
<head>
be OK to use if there are still XHTML elements on the pages? Or should I handle it differently?

not2easy

6:18 am on Jul 9, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



If you do that, you will cause browsers to render your pages in quirks mode. If you have not altered the code on the page to comply, as explained previously, why do it? You are asking for troubles you don't need.

lucy24

7:26 am on Jul 9, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



the headers being output on our site, were saying the pages were encoded in UTF-8, but we had either no meta name on each page, or it was defined as ISO_8859-1.

Never mind that. What encoding are the pages actually in? Changing a header-- whether in-page or globally-- does not change the content of a page that already exists.

You should definitely include a charset declaration-- but it has to agree with the real encoding of the real page. If you say nothing, a browser has to guess, and it may guess wrong. If the site itself says the wrong thing, the browser will always display the wrong content.*

If your pages are 100% ASCII text, using entities (ugh) for all non-ASCII characters, it doesn't matter what encoding you declare.


* Well, almost always. If a page was made in a one-byte encoding including codepoints that aren't valid in UTF-8, an intelligent browser may decide that a UTF-8 (or other multi-byte) charset declaration can't possibly be right. Or, then again, it may decide to render the page in Cyrillic-DOS or whatever seems handy.
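Lucy24's point about all-ASCII pages can be demonstrated in a couple of lines. A Python sketch with a made-up markup string: every common one-byte encoding (and UTF-8) is a superset of ASCII, so a pure-ASCII page decodes identically no matter which of them is declared:

```python
# Pure-ASCII markup: all non-ASCII characters written as entities.
pure_ascii = b"<p>Copyright &copy; 2014 &ndash; all rights reserved</p>"

# Confirm there is nothing above codepoint 127:
print(pure_ascii.isascii())  # True

# Identical text under every common ASCII-superset encoding,
# so the declared charset cannot cause mojibake on this page:
print(pure_ascii.decode("utf-8")
      == pure_ascii.decode("iso-8859-1")
      == pure_ascii.decode("cp1252"))  # True
```

The moment a literal non-ASCII character (a real © byte) lands in the file, this guarantee evaporates and the declaration has to match the actual encoding.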

scscamper

2:51 pm on Jul 9, 2014 (gmt 0)

10+ Year Member



Ok, is there anyone here who wouldn't mind a PM? I can send you my URL, and hopefully you can tell me what best practice would be for my pages?

engine

3:13 pm on Jul 9, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Mod note: Let's keep the discussion in the thread. That way we all benefit and the thread will be educational for future members. :)

scscamper

3:55 pm on Jul 9, 2014 (gmt 0)

10+ Year Member



Ok, thanks Engine, that will be no problem. Can I post a bit.ly link to my website on this thread for people to check for me? :) Just want to make sure I'm within the rules!

not2easy

5:20 pm on Jul 9, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



There is one place here where personalized help is available; it requires a subscription, and you can get there by clicking the "subscribe" link in the menu at the top of the page. Help here is voluntary and free, but it is kept non-personalized so that it can be useful to others who have a similar issue in the future and come in to search for help.

If you need to determine what the actual encoding of your pages is and to change the encoding, there are text editors you can use to do this. If you are on Windows, look for Notepad++ which is free. Notepad++ can determine the actual encoding for the file and it can convert the files for you if you like.

scscamper

5:34 pm on Jul 9, 2014 (gmt 0)

10+ Year Member



Not2Easy, we do have a subscription; we have paid for a year. How do I go about submitting for help there? I also don't mind submitting a case study, or copying certain information here so it helps the whole community. I am downloading Notepad++ to check the files as suggested as well. Thanks!

not2easy

6:36 pm on Jul 9, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Well, in that case you can go right over to [webmasterworld.com...] and ask for review on the exact issues you have outlined here. I see there is a requirement to help others before that perk kicks in. Each forum has its own Charter that spells out the terms for that forum, you can look through there to get started. Good luck in getting this resolved, see you there.

scscamper

9:23 pm on Jul 9, 2014 (gmt 0)

10+ Year Member



Well, I downloaded Notepad++ and so far it looks like the pages were created as UTF-8. So I think I am OK there!

scscamper

2:59 pm on Jul 10, 2014 (gmt 0)

10+ Year Member



OK, for everyone's reference, I spoke with Lucy24, and this is her response to me:

I looked at some random pages and simply couldn't find any non-ASCII characters. Everything non-ASCII is expressed as an entity, like &copy; or &rdquo;. So it makes no difference what encoding you declare.

Now, if you had a lot of non-ASCII characters, it would make sense to replace the entities with the proper characters like © (&copy; ). And then you would have to make sure the declared encoding matches the actual page content. But as it is, the whole thing is a non-issue.

It's a good habit to express all literal ampersands & as &amp; using the entity. As long as they are free-standing (followed by a space), nothing will break and the validator won't kick up a fuss. But you may as well get in the habit.

Oh, and the DTD (doctype) has nothing to do with encoding. If you use the html5 doctype "doctype html" and that's all, there's an alternative shorter format for the text-encoding ("charset") declaration. But everything is backward-compatible, so the old-style "meta charset" form will work fine.
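The ampersand habit mentioned above can be sketched as a small filter. A Python sketch with a hypothetical helper name — the lookahead is a rough heuristic for "already looks like an entity," not a full HTML escaper:

```python
import re

# Match a literal & that is NOT already the start of a named entity
# (&copy;), a decimal reference (&#169;), or a hex reference (&#xA9;).
BARE_AMP = re.compile(r"&(?![a-zA-Z]+;|#\d+;|#x[0-9a-fA-F]+;)")

def escape_bare_amps(text: str) -> str:
    """Replace bare ampersands with &amp;, leaving existing entities alone."""
    return BARE_AMP.sub("&amp;", text)

print(escape_bare_amps("Fish & Chips &copy; 2014"))
# -> Fish &amp; Chips &copy; 2014
```

Because &amp; itself matches the named-entity lookahead, running the filter twice does no harm -- already-escaped text passes through unchanged.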

Thanks very much to her for taking a look at it for me!

I guess my biggest question is why we are seeing such a big drop in Bing after making these changes. I got an email from Bing Support saying that, basically, it is normal to see declines in Bing after making site changes; after several recrawls it will stabilize and normalize. Now, the odd thing about that is, if the site changes were positive, wouldn't you expect to see upward motion rather than down? Why get a penalty for changes?

not2easy

6:28 pm on Jul 10, 2014 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



You mentioned that on July 6th Bing was unable to read your metadata; I don't know whether they would get that sorted out quickly or not. Have you revisited Bing Webmaster Tools and used the analyzer since making your latest changes? That might give you some insight.

scscamper

7:28 pm on Jul 10, 2014 (gmt 0)

10+ Year Member



Not2Easy,

Yes, we have checked all relevant pages in the Bing SEO Analyzer located in their Webmaster Tools. We resubmitted the XML sitemap on the 8th of July, but still no traffic back from Bing. We know Bing is a little slower than Google at indexing pages.

We also reached out to Bing via their email support. The team responded that after Bing crawls the website several more times, we should start to see the site regain its traffic. They were also kind enough to provide several links to Bing resources to help improve our rankings in Bing.

So here is to hoping that our site recovers quickly in Bing.

To all who helped and responded to us thank you and have a great day.

tangor

11:55 pm on Jul 10, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



After several recrawls it will stabilize and normalize. Now the odd thing about that is, I would expect if site changes were toward the postive, wouldn't you see upward motion rather than down? Why get a penalty for changes?


Not necessarily a penalty; more of a wait-and-see about what else might be happening. Big changes of any kind tend to trigger that sort of response.

One site I managed (a few years back) had been in FRAMESET layout since 1999. We changed to standard HTML with CSS layout, and it took a year to regain rank. Nothing else changed, but that change, by itself, was dramatic for that website. The same thing might be occurring here.