Hey all – I've been taking a vacation from the blog for a while, sorting out some personal issues. I wanted to bring up a meta-search engine that I've been getting back into. It's not new, but it's a handy way to search when your old standbys aren't doing the job.
A meta-search engine is a search engine that sends user requests to several other search engines and/or databases and aggregates the results into …
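Under the hood, that aggregation step boils down to fanning a query out and merging the ranked lists that come back. Here's a minimal sketch of the merging part, using made-up engine names and result data (a real meta-search engine would fetch live results and use a far more sophisticated scoring model):

```python
def aggregate(result_lists):
    """Merge several ranked result lists into one, best score first.

    Each list is assumed to be ordered best-to-worst; a URL earns
    1/rank points from each engine that returned it, so items that
    rank well across multiple engines float to the top.
    """
    scores = {}
    for results in result_lists:
        for rank, url in enumerate(results, start=1):
            scores[url] = scores.get(url, 0.0) + 1.0 / rank
    return sorted(scores, key=scores.get, reverse=True)

# Stand-ins for responses from two hypothetical engines:
engine_a = ["example.com/a", "example.com/b", "example.com/c"]
engine_b = ["example.com/a", "example.com/b", "example.com/d"]

merged = aggregate([engine_a, engine_b])
print(merged)  # example.com/a first: both engines ranked it #1
```

The reciprocal-rank scoring here is just one simple choice; the point is that the meta-engine never crawls anything itself, it only re-ranks what the underlying engines hand back.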
Note: the image was altered to fit the width of my blog.
“Customized for the metro Philly area”, eh? Interesting, except my actual location this time was Reading, Pennsylvania – Berks County, well outside metro Philadelphia. I'm not exactly sure how the geo-targeting works in this case (I'll have to look into that), but when I checked my IP path, I'm not routing through Philly. Why not choose Harrisburg, then? It's just as close.
SEO and IP aside, I just started to wonder whether this was a good idea at all.
I wasn’t logged in. I wasn’t asking for personalized search. What if I didn’t want an art program in the Philadelphia area, but rather an art program like Photoshop? Why would I want a customized “local” search? Or, what if I was open to any location? Granted, these results really didn’t seem that customized to Philly this time around, but how far can Google take this?
I'd prefer some parametric buttons that would let me choose results customized to my location, instead of the customization just being “on”.
With the surprise news that Adobe has hooked Google and Yahoo up with a special reader for their spiders (which lets the engines parse .swf files and index/follow deeper content), does that mean the SEO's PE special weapon can be abandoned?
I'm still going to stick with it for a while on my SEO blog and my clients' sites. Google is adopting the reader first, and has technically been lightly reading some Flash files already, but anyone who's been in the game long enough knows that these properties launch half-powered products all the time. Their track record isn't stellar, so why not take a ‘better safe than sorry’ approach? I don't think I'd consider dropping PE until at least a few months after MSN jumps aboard. I'm not sure I'd stop building products around the PE method (depending on cost vs. value), and simple on-page coding is so easy that it seems like a no-brainer.
What about the other engines that will never be this advanced? Do you care about them? In preparation for vertical and social searching, I think it's wise to consider what they could become. I expect this news to spark a huge influx of Flash sites, but that still seems like a bad idea to me.