Forum Moderators: phranque
I wonder how many sneaky new things are being thought up for implementation in the next couple of years, and I'm interested to see what people think they could be.
2.5 billion pages, looked at from "5 years down the road", is small. Is there much value in "we have it all"? (Not saying, though, that indexes can't be split.)
Theming engines at maybe the size Google is now could become the norm.
IMO kartoo.com is cutting edge, not because of Flash but because of its data organization, comparison and visualizations.
I don't believe ranking (according to the better manipulator or physical content) will play a big role.
[edited by: fathom at 12:39 pm (utc) on Oct. 29, 2002]
With Google banning certain sites due to extreme content, there is likely to be a debate as to where the line should be drawn. Also, with certain countries imposing restrictions on internet access, it is likely that some common agreement will be reached.
I suspect the result will be more consistency across the board in five years' time with regard to legislation (defining "unsuitable content", accessibility, etc.).
Also, the level of knowledge possessed by the average surfer will be much higher, so all the big companies that now dominate on general search terms will see less and less benefit from it (as users are better equipped to find specifically what they want using clearly defined search terms).
Content will continue to dominate the value of a website, but with increasing developments in technology, graphics, animation and, more importantly, user interaction will be valued higher (posing an interesting task for SEs! ;)).
I would imagine that on one hand you will see SEs, governments and large orgs looking to create a "consistent web", whereas users (now knowledgeable in what can be offered) will demand more varied services.
Just my rant though! ;)
It's a hard thing to judge - all that can be done is to wildly speculate based on limited facts.
We can't even guess what Google is going to do with this update, let alone in 5 years' time (it could even be Microsoft Google by then.... ;))!
Another wild theory I had once was that the Internet is likely to become too large and messy to be effectively organised and may just be dumped.
I read something about the "Grid" which was being developed by academics (in the UK, I think - perhaps worldwide though) and was being billed as the Internet 2.
<in the style of movie preview voiceover guy>
Internet 2 - death to the web! </>
If that were to happen we could see not only websites competing with each other, but Internets! :) How much of a SEO nightmare would that be?
In fact, taking it a stage further, Web vs Grid vs WAP vs Digital TV vs Games Consoles vs The Next Big Thing...
I'm getting out of this business. Anyone need some furniture painted? :)
Geez, I remember when I thought my Amstrad with tape drive and 16 colours was the cream. It took half an hour to load my favourite game.
I recently had a chat with a physicist friend of mine. She's currently working on the use of photons instead of electrons for data transfer, and she's complaining that her system keeps crashing because she keeps adding up numbers beyond 32-bit floating point.
So, back to the future: with more and more networks being arranged on the basis of information association, you might see changes not only in the way information is searched for, but big changes in how it is presented in the first place. Just imagine the whole game changes and instead of using filenames we'd only be using file categories. That would bring a whole new dimension to the search engine algorithms.
If you can predict where this is heading (what is the name for 1024 terabytes, by the way?), you'd have a good chance of making big bucks on the stock market.
Just MHO. :-)
good sci fi book as well. :)
This brings up another interesting issue as well - a group of people control the way the galaxy develops by limiting and controlling information. They end up ruling through fear as a religious-type group (people called "priests" were summoned to fix the computers and run the power stations, etc.).
Building a website will be as easy as sending an email is today.
The SERPS will contain a larger mix of top-level domain names. .COM will not be as important as it is now.
China will be all over the internet (and search engines).
Search engines will do a better job translating.
You'll need 50,000 incoming links to get PageRank 4.
Human directories will not be able to keep up, but they'll list the more important sites.
There will be much more multimedia.
I think the search engines will get better at seeing whether the text on a page makes sense or is only keywords placed to get a good position.
The spam filters will get better and better.
Someone out there who agrees?
More online gaming for a start. Virtual tours? Completely customisable websites (like what john316 said - customisable from your desktop). Probably quite a few things that we can't conceive of just now!
I reckon websites will become much more interactive on the whole.
I think the Library of Congress also attempts to index all published documents, right? (Not sure if just in English or whatever - no expert.) But I also believe the amount of digital information surpassed the amount of available paper-based info about 10 to 15 years ago. To expect one engine to rank all that material reliably for many different uses may be difficult. Google is showing strains already.
Just some ideas thrown in for good measure.. I won't fight to my death for them!
There might be more multimedia in a few years. But I think the relevant sites won't use it.
There is some content that would get better when it has pictures or is animated.
But most content will be plain text.
I grew up in the 1950s and early 1960s, when university courses were being offered on TV and a lot of people thought television (along with "teaching machines") would play a major role in education. The predictions didn't come true. For that matter, it was only a few years ago that CD-ROMs were being hyped for their multimedia capabilities. The multimedia CD-ROM market never really took off, partly because of the Internet but also because multimedia CD-ROMs weren't a very efficient way to deliver information.
On the Web, multimedia is useful as a supplement to text, but it's seldom a useful replacement for text. Take a travel site on London: Travelers might enjoy watching video of the Changing of the Guard at Buckingham Palace and a boat tour of the Thames, but when they're looking for hotel recommendations or museum opening hours, it's much more efficient for them to read text--whether that text is on a Web page, in a guidebook, or in a magazine article. Five years from now, video may be delivered more smoothly and at higher resolution, but that won't change the fact that a video isn't the best way to learn where one can find a nice double room for XXX pounds a night within walking distance of Harrods.
By the way, I think it's important that people distinguish between "the Internet" and "the Web." The Internet is a means of delivery; the Web is a specific medium. To put it another way, the Internet is like the Post Office, while the Web is like the magazine that arrives in your mailbox. Of course, the Web is often used as a front end to the Internet; an example would be a Web link that starts a file download. So an outfit like Netflix or Blockbuster might well deliver movies via the Internet in the future, using its Web site as the ordering mechanism. But that doesn't mean movies will be taking over the Web--it just means the Web will be used as a way for people to look up, order, and request movies, just as it's a way for people to order airline tickets or book hotel rooms now.
I'd like to see better translations because as more and more people go online internationally, I'd like access to knowledge that's currently buried (for me) in other languages.
The search engines, whether Google or not, will have even greater relevancy algorithms and possibly be aided by SEO people in making sites easier to index. (Our relationship does NOT have to be adversarial.) There will be duality of purpose in sharing what people search on (key phrases, current events/culture/slang) and what a site is trying to be found for (products/services/info). Reciprocity of tools & design will aid the public, but ultimately the data is primarily text-based.
Data via other methods (e.g. graphics) may be pushed by disabled groups first- and then find applications for wider markets (e.g. imagine cooking in your kitchen and having search results "read" to you while you voice commands to visit site X to see if it's a good match). Or being a parent and looking for visuals of West Nile Disease symptoms online.
People (/businesses, /governments) will attempt to commercialize or censor data and communication. This will be on-going. But simplicity and sheer numbers of massed voices will keep the debate (and free info) alive.
Imagine paying to use Google? Imagine paying to access your current favorite bloggers? Or other sources of data? Micro-fees would not be horrible IF they were truly micro and we were essentially paying for convenience, good management of an excellent & useful product, etc. But our online community is generally best served when info is free and shared as opposed to "paid for". Plus, I'd hate to see even further segregating of the online community into the "haves" and "have-nots". Say I'm a kid looking for info on molestation and finding help? I don't have money or access to electronic money but I still need the data.
We can't get away from appointing experts or sources to manage the data (just too huge). Their proliferation will rise. But as long as open communication remains, enough voices can check, critique, or question the data and its relevancy.
Last, I'm surprised no one has mentioned PDAs or cell phones. Web access and searches will rise for these. Hopefully, we won't have too many standards to optimize for (e.g. xml). Search engines, web sites, etc. will have to condense data to fit the screen (& other) limitations and serve up even faster results.
5 years from now, we'll all still have jobs if we want them and have managed to move with the times. Because parsing data & translating the techniques that move data or improve a site's relevancy or ability to be found- will still be valuable.
what is the name for 1024 TeraBytes by the way?
1024 terabytes is 1 petabyte.
1024 petabytes is 1 exabyte
1024 exabytes is 1 zettabyte
1024 zettabytes is 1 yottabyte.
Then we run out of names, I guess.
These may sound funny now, just like "gigabyte" used to a few years ago, but we'll likely be using the first two or three of them soon enough.
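The 1024-step progression above can be sketched in a few lines of Python (a minimal illustration of the naming scheme; the function name is my own):

```python
# Binary unit names, each step up multiplying by 1024.
UNITS = ["byte", "kilobyte", "megabyte", "gigabyte", "terabyte",
         "petabyte", "exabyte", "zettabyte", "yottabyte"]

def name_for(num_bytes):
    """Return the largest unit name for an exact power-of-1024 byte count."""
    n, i = num_bytes, 0
    while n >= 1024 and n % 1024 == 0 and i < len(UNITS) - 1:
        n //= 1024
        i += 1
    return f"{n} {UNITS[i]}{'s' if n != 1 else ''}"

print(name_for(1024 ** 5))  # 1024 terabytes -> "1 petabyte"
print(name_for(1024 ** 8))  # -> "1 yottabyte", and then we run out of names
```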
It seems as though Google will come out with their alleged Product Search [webmasterworld.com]. I think this is a good start. I have actually been throwing the idea around for many months now with some colleagues, and we had plans to possibly develop a solution for this. Since we found out Google was doing it, though, we decided against it, because it would be very difficult to compete with them. However, I still do not think they will be utilizing the idea to its full extent, as a result of some business-related factors that may hinder development to its full potential (that is, I don't think it will be utilized by Google to its fullest extent in the next couple of years).
However, eventually Artificial Intelligence will come into play and will make today's methods seem like child's play.
MS Windows 2006 Fifth Edition will come nicely equipped with a handful of choices (but not optional) for your very own 'guide'...this half-way-there AI digital avatar will guide you through all your tasks and functions.
You need to search the web? Your avatar will ask you whether you are looking for information or a product/service to purchase... you specify your motive, then your avatar will offer you a SERP (which will be the SERP of the partnered/branded SE of MS).
As you ponder the choices, your avatar will offer you additional information... this information is generated through the complex partnerships between MS and MS's partner SE... your search query will be sent to a central database... there your search query will be met with a pre-selected advertiser, or whoever was willing to pay to have their information offered through this 'affiliate' program... this concept is similar to PPC... but the difference is that it's real-time/query-by-query based... and perhaps a little more targeted... but that's a big maybe ;)
As you publish your web site/page(s), you are required to specify your site/page(s) as either content or commerce... violation of this code will cause you not to be banned from the index, but you will have to cough up some fines... continuous violation will result in temporary removal from the index... and to be re-indexed, you will be asked to cough up some more fees.
I am going back to my lunch now:)
One interesting point for me was the point that Internet users are going to become more "search-savvy", meaning that they (we) will actually be 'trained' to search using exactly what we want to search for. As always with issues like this, it brings up the point to wonder whether this training is just happening as a by-product of the technology, or whether there is someone or some persons who have created this global 'training program' intentionally.
I travel a lot, and all over the world, the younger generations are much more like each other than the older generations...growing up watching the same kinds of TV, eating the same kinds of food (thanks to McDeath), and becoming generally more homogeneous.
I've kind of gotten away from the topic, but it is interesting to consider what the technology changes on a global scale are doing to/for society, and it's both scary and fascinating to wonder whether those changes are simply happening or whether there is some degree of intentional social engineering going on by many of those who control the most powerful forces in technology.
It remains to be seen... but thank-you to all of you for your great thoughts and comments.
The PageRank concept is one such area where there are many interesting possibilities. Here is one basic example of how one could customize PageRank: currently, PageRank gives every page equal weight and then churns through the PageRank math. Perhaps you could customize a search for, say, blue widgets by giving an initial PageRank only to pages that contain blue widgets, and then churning through the PageRank math. Then you could combine this modified PageRank with other variables, etc.
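That idea amounts to a topic-restricted PageRank, where only pages matching the query term receive the initial (and teleport) weight. A minimal sketch in Python; the tiny link graph and the choice of "topic" pages are made up purely for illustration:

```python
def topic_pagerank(links, topic_pages, damping=0.85, iterations=50):
    """PageRank where the random surfer teleports only to topic pages."""
    pages = list(links)
    # Initial weight concentrated on pages that mention the topic.
    rank = {p: (1.0 / len(topic_pages) if p in topic_pages else 0.0)
            for p in pages}
    for _ in range(iterations):
        new = {p: 0.0 for p in pages}
        # Each page passes a damped share of its rank along its outlinks.
        for p in pages:
            out = links[p]
            for q in out:
                new[q] += damping * rank[p] / len(out)
        # The remaining (teleport) mass goes back to the topic pages only.
        leak = 1.0 - sum(new.values())
        for p in topic_pages:
            new[p] += leak / len(topic_pages)
        rank = new
    return rank

# Toy web: page "a" mentions blue widgets; "b" and "c" link around it.
links = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
print(topic_pagerank(links, topic_pages={"a"}))
```

The whole graph still gets ranked, but all the "random surfer" probability is biased toward the topic pages, so the scores reflect importance with respect to that query rather than the web at large.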
The list could go on and on, and there are a tremendous number of ways to customize the variables. This could get a bit too involved for the average searcher, but this is where the web community could step up to the plate. Webmasters could develop their own super-advanced search settings and then post them for others to use. Maybe search engines could give incentives for people to develop good super-advanced algorithm settings. The more the settings are used, the more the creator of the settings could benefit. It would kind of turn the whole algorithm-tweaking process into a free market system. If one set of settings gets spammed out by ruthless SEOs, then the searchers will eventually migrate to another set of settings that others have developed or modified.
Currently, search engines play a cat and mouse game with the spamming type of SEOs. Maybe future search engines could turn it into a game of cat and mouse between the SEOs and SESCs (search engine settings creators).:) Maybe search engines could find ways to measure the “happiness” of searchers who use various settings. Then search engines could play the role of ranking the best search settings for its searchers to use.
Regardless, it is fun to think about:)
Google somewhat measures for this but it appears pretty imprecise. Simply leaving or not returning to a search doesn't mean you've met with success (can't be taken for granted). Landing on one site and not returning doesn't mean it's the best match.
The next step needed for evaluation is something that appears (ack! anything but a pop-up) when the user is exiting the URL they clicked on (it could cover the domain in general, not simply the original URL result). Positive evaluations could increase that site's relevancy factor for that keyword (or keyword phrase). It's got to be easy for the user to do, in a non-annoying/non-obtrusive manner.
Too bad there's not a consortium of SEs to share data. That'd be very interesting.