Forum Moderators: coopster

PHP with SEO?


hawkerz

4:45 pm on Jun 14, 2007 (gmt 0)

10+ Year Member

I'm pretty new to the whole concept of 'SEO,' but I just recently began doing some heavy research on the subject. What I can't seem to figure out is whether or not search engines will read dynamic content in PHP.
As an example:

I have a page of indexes (for our purposes, I'll say I'm indexing 20,000 items) and I want all of them to show up. I have two options. One is a static index page of part numbers, with links to a page that uses $_GET to pull the part number out of the URL and insert it throughout the page. This way, PHP generates the page from page.php?partnum=ABCDEFG1234567, changing the meta tags, title, etc.

My other option is to create each of those 20,000 pages independently as static pages, with links like ABCDEFG1234567.HTML instead, making them non-dynamic. Which is better for SEO purposes?
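The first option can be sketched roughly like this. The file name page.php comes from the post above; everything else (the URL format check, the page markup) is illustrative, not a working site:

```php
<?php
// page.php -- a sketch of option one: a single script serves every part,
// keyed off the partnum query-string parameter.
function renderPartPage($partnum)
{
    // Whitelist the expected part-number format so arbitrary input
    // can't be echoed into the page. The pattern is an assumption.
    if (!preg_match('/^[A-Z0-9]{1,20}$/', $partnum)) {
        return '<h1>Part not found</h1>';
    }
    $safe = htmlspecialchars($partnum);
    return "<html><head>\n"
         . "<title>Part $safe - Widget Co</title>\n"
         . "<meta name=\"description\" content=\"Specs and pricing for part $safe\">\n"
         . "</head><body>\n<h1>$safe</h1>\n</body></html>";
}

// In the real script, the query string drives the output:
echo renderPartPage(isset($_GET['partnum']) ? $_GET['partnum'] : '');
```

The point is that the title, meta description, and heading all change per part number, exactly as described above, from one script.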

mattclayb

5:02 pm on Jun 14, 2007 (gmt 0)

It is generally thought that URLs containing readable text related to the page content are better for SEO than dynamic URLs.

This can be achieved using the mod_rewrite module in Apache if you are on a Linux server (note that mod_rewrite is part of the web server, not PHP; you will also need to set up your .htaccess file properly).

However,

I am beginning to think that this is becoming less and less of a problem. Search engine crawlers are extremely sophisticated (especially Googlebot) and have no trouble rendering dynamic pages and viewing the content as a human would.

So if your content is right, then you shouldn't have a great deal to worry about.

I am of the opinion that content is THE most important thing in SEO; remember, Google wants sites with good, accurate content to perform well. So your SEO campaign should revolve around the question, "what are the traits of a good, popular website?"

willybfriendly

5:06 pm on Jun 14, 2007 (gmt 0)

Better yet, you can generate URLs that look like mysite.com/ABCDEF.htm and, using mod_rewrite [httpd.apache.org] (assuming you are on Apache), convert them server side to mysite.com/index.php?partnum=ABCDEF
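A minimal .htaccess sketch of that rewrite (file names and the allowed character set are placeholders):

```apache
# Map mysite.com/ABCDEF.htm internally to index.php?partnum=ABCDEF.
# Requires mod_rewrite to be enabled and AllowOverride to permit it.
RewriteEngine On
RewriteRule ^([A-Z0-9]+)\.htm$ index.php?partnum=$1 [L,QSA]
```

The visitor (and the crawler) only ever sees the static-looking .htm URL.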

hawkerz

5:19 pm on Jun 14, 2007 (gmt 0)

I'm actually on IIS for the first time in my life, unfortunately, and the company I'm working for isn't interested in switching over to Linux, but I'm making do with what I have.

As far as mod_rewrite is concerned, it's not an issue. I have text files containing part numbers (there are 20,000,000 in reality), separated by line breaks (\r). I wrote a script in PHP that uses fgets to parse data from the text files, so each part number can be stored in a variable and reused with no issue. My only problem was space and resources, but the owner of the company has assured me that I have whatever I need at my disposal.
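The parsing approach described can be sketched like this (the file path is made up; note that fgets splits on \n, so if the separators really are bare \r with no \n, you would have to split the file on \r yourself):

```php
<?php
// Read part numbers from a plain-text file, one per line.
// Lines are trimmed so \n or \r\n endings both work.
function readPartNumbers($path)
{
    $parts = array();
    $fh = fopen($path, 'rb');
    if ($fh === false) {
        return $parts;
    }
    while (($line = fgets($fh)) !== false) {
        $partnum = trim($line, "\r\n ");
        if ($partnum !== '') {
            $parts[] = $partnum;
        }
    }
    fclose($fh);
    return $parts;
}
```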

mattclayb

6:17 pm on Jun 14, 2007 (gmt 0)

Aside from SEO,

Why not store the part numbers in an SQL database, and use a PHP script to extract the data and do whatever you want with it? That would be a lot more efficient.
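A sketch of that database approach, using PDO. SQLite in memory is used here purely so the example is self-contained; a real site would point the DSN at its own database server, and the table name is made up:

```php
<?php
// Store part numbers once, then look them up per request.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE parts (partnum TEXT PRIMARY KEY)');

$insert = $db->prepare('INSERT INTO parts (partnum) VALUES (?)');
foreach (array('ABC123', 'DEF456') as $p) {
    $insert->execute(array($p));
}

// Prepared statements keep the lookup safe from SQL injection.
$stmt = $db->prepare('SELECT partnum FROM parts WHERE partnum = ?');
$stmt->execute(array('ABC123'));
$row = $stmt->fetch(PDO::FETCH_ASSOC);
```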

So are you concerned that robots will not be crawling the text files, and therefore not reading the part numbers?

If your script is open to the crawlers and it outputs the part numbers onto the rendered page then you shouldn't have a problem.

hawkerz

6:38 pm on Jun 14, 2007 (gmt 0)

That theory is totally against everything I've read about SEO, which all says no dynamic pages, no referencing databases, etc.

netchicken1

6:49 pm on Jun 14, 2007 (gmt 0)

I can't use mod_rewrite on my PHP site, but Google is always indexing it. Make a sitemap and submit it to Google, and have a vanilla HTML page to add your links to.

mattclayb

7:05 pm on Jun 14, 2007 (gmt 0)

Well, you'll find that lots of people have different opinions about what works in SEO.

Like I said earlier, it is generally thought that dynamic URLs are less effective than 'plain' URLs that contain keywords relating to page content.

However, I am finding that this is becoming less and less of an issue, especially with Google. Take the BBC, for example: bbc.co.uk performs exceptionally well in almost every search, yet it does not use any 'SEO-friendly' URLs; they are all dynamic.

It performs well because it is a popular site with excellent content, which brings me back to my point: "Think about what traits a good, popular website has."

This is also normally specific to the area you operate in. A popular news site will have different characteristics from a popular website selling widgets, and the search engines have sophisticated algorithms that test these characteristics.

The number one rule is providing good content for your area of business.

I do agree, though, that having keywords in URLs is important from a user's point of view. If they see their search term in the URL on the search results page, they are more likely to click on that link.

hawkerz

7:09 pm on Jun 14, 2007 (gmt 0)

Well, the issue I'm facing is mostly that the industry I'm involved in isn't a very popular one. We do commercial work on a pretty large scale, but we have a very small group to target, so it's unlikely for us to get 'free links' anywhere. The only people with content similar to ours would be our competitors, and we are useless to the average consumer; we target large corporations. So having 'good content' is pretty irrelevant in this industry, because anyone who finds the site will already know what they were looking for.

mattclayb

7:36 pm on Jun 14, 2007 (gmt 0)

'Good content' is not the same as having a lot of content. Also, being in a slightly obscure industry is actually a huge advantage.

You're halfway there already, because you're considering your target, and what they need is the content you provide.

Research is very important: find out who does the initial research into your product, and what decisions they need to make to buy from / contact you.

Look at your competitors: what are they providing, and, most importantly, what are they not providing? Try to be innovative in your content and its delivery, but in a useful way.

After you have researched your target and what they are looking for, research keywords. Search the keywords, look at what comes up, study how those pages are built, and copy it. Look at what sites they link to and from, then contact those sites and convince them to link to you (or instead of your competitor).

With linking, quality, not quantity, is most important. A few good-quality backlinks can do wonders. Look at sites that provide information in your industry; are there journals? Also try PR: if you have a new product or interesting news, send them a story asking for a credit and a link, or pick up the phone and speak directly; this is normally the most effective method.

Aim high: try government bodies, public bodies, and high-profile brands. Try not to 'link exchange'; search engines detect reciprocal links and downgrade them. If you do reciprocal link, keep the links contextualised in content and not on a links page. NEVER automate linking or use link farms.

Start a newsletter, or write for an industry-related blog.

Make sure you've got all the basics in place on your site: a correctly configured robots.txt, a sitemap, and meta tags (with no spamming). MANUALLY submit your site to directories and search engines, monitor your site and make sure the robots are regularly crawling your content, and check for broken links and 404 errors.

hawkerz

7:55 pm on Jun 14, 2007 (gmt 0)

Well, I'm doing the best I can. The owner of the company has a specific idea and that idea involves generating millions of HTML pages which are sort of relevant, but he wants to have static pages with all of the part numbers. Hey, I just do what I'm told. He mentioned some backlinking to me, but I'm no industry expert so as far as some of the other stuff you mentioned, I'll have to talk it over with him.

I know what customers are looking for - most likely, in this particular industry, a customer would just type in the part number into google and pick whatever looks the best.

As it is, the site I started with was all Flash, so I guess we'll have to see...

mattclayb

8:09 pm on Jun 14, 2007 (gmt 0)

Well, if users are most likely to enter a specific part number, then listing all the part numbers on the site seems like an excellent idea. You can't get more targeted than that, and offering that many part numbers is surely going to cover a wide search area.

If you go down the route of storing the data in a database, you will need to make sure the pages are 'renderable' by a search engine.

For example, don't make the data available only through a search or a form, as a robot will not submit a form. Make sure you have some sort of menu system or directory of links that a user or robot can crawl through. This way, it will not matter that the site is dynamically driven: the robot will render each page (like your browser), then crawl each link, eventually reaching every part number.

If you set up your script to output the part numbers in important places, i.e. links and <h*> tags, this will give the part number high importance. Also dynamically print the part number into the meta tags.
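A sketch of such a crawlable directory page: plain anchor links a bot can follow, generated from a part list (the URL pattern and markup are illustrative assumptions):

```php
<?php
// Build one index page of plain <a> links that a crawler can follow.
// The /PART.htm URL shape assumes static files or a rewrite rule.
function buildIndexPage($partNumbers)
{
    $links = '';
    foreach ($partNumbers as $p) {
        $safe = htmlspecialchars($p);
        $links .= "<li><a href=\"/$safe.htm\">$safe</a></li>\n";
    }
    return "<html><head><title>Part index</title></head>\n"
         . "<body><ul>\n$links</ul></body></html>";
}
```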

hawkerz

8:33 pm on Jun 14, 2007 (gmt 0)

Well, I had figured on indexing all the parts. The problem is that I now have 20,000,000 HTML pages, and Google prefers only 100 links per page. How do I go about indexing them? I wanted to have an A-Z, 0-9 type of index at the bottom of each page, but I would need something like 20,000 pages to accomplish what I want, which is 572 pages per character/letter, so I'd need 5 or 6 pages for each letter of the alphabet just to index the pages that index the pages...

mattclayb

3:54 pm on Jun 19, 2007 (gmt 0)

You can produce multiple sitemap files containing info on all the pages you would like crawled (each sitemap file can contain a maximum of 50,000 URLs). You can then point to each sitemap in your robots.txt file.
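Generating those files from the part list can be sketched like this. Returning the XML as strings (rather than writing files) keeps the example self-contained; URLs and the chunk size parameter are per the 50,000-entry limit mentioned above:

```php
<?php
// Split a URL list into sitemap documents of at most $perFile entries,
// per the sitemaps.org protocol limit of 50,000 URLs per file.
function buildSitemaps($urls, $perFile = 50000)
{
    $files = array();
    foreach (array_chunk($urls, $perFile) as $chunk) {
        $xml  = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n";
        $xml .= "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n";
        foreach ($chunk as $url) {
            $xml .= '  <url><loc>' . htmlspecialchars($url) . "</loc></url>\n";
        }
        $xml .= "</urlset>\n";
        $files[] = $xml;
    }
    return $files;
}
```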

The bots will then crawl all the pages.

This will mean that you won't necessarily have to have direct links to all pages.

hawkerz

3:59 pm on Jun 19, 2007 (gmt 0)

Well, the way I decided to go with it is pretty interesting. I have indexes at the bottom of the page, letters A-Z and 0-9. When I finish I'll need to use all of them; for now I'm only using A. When you click A there is a list of 192 separate index pages (a1-a192), and on each index page is a direct link to 300 of the parts in HTML format. If you click on any part, embedded in the page is also a direct link to the part before it and the part after it, making it VERY crawlable. Search engines follow links, and these all link forward and backward to each other.
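The forward/backward links described can be sketched as a small helper, given the full ordered part list and one part's position (the URL shape is illustrative):

```php
<?php
// Emit previous/next links for the part at index $i, chaining every
// part page to its neighbours so crawlers can walk the whole list.
function prevNextLinks($parts, $i)
{
    $html = '';
    if ($i > 0) {
        $prev = htmlspecialchars($parts[$i - 1]);
        $html .= "<a href=\"/$prev.htm\">&laquo; $prev</a> ";
    }
    if ($i < count($parts) - 1) {
        $next = htmlspecialchars($parts[$i + 1]);
        $html .= "<a href=\"/$next.htm\">$next &raquo;</a>";
    }
    return trim($html);
}
```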

mattclayb

4:02 pm on Jun 19, 2007 (gmt 0)

You will also need a sitemap index file, pointing to each sitemap.

This file can link to up to 1,000 sitemaps.

So, in theory, you can give any crawling bot information on up to 50,000,000 pages.
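A minimal sitemap index file looks like this (host and file names are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap_index.xml: one <sitemap> entry per individual sitemap file. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemap1.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap2.xml</loc></sitemap>
</sitemapindex>
```

Submit the index file (or reference it from robots.txt) and the bots discover every sitemap it lists.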

hawkerz

4:06 pm on Jun 19, 2007 (gmt 0)

On that same note, couldn't I have a sitemap linking to sitemaps, etc.?

mattclayb

4:51 pm on Jun 19, 2007 (gmt 0)

Yes, that would work.