Archive for November 2005
30 November 2005 in Search Technology | Comments (0)
Originally search engines were all about recall – returning as many results as possible for a query. That was fine, but you often didn’t get the results you were actually looking for.
Then some chaps from Stanford brought us the wonder of Google – both great recall and pretty good precision (relevance).
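For anyone who hasn’t run into the terms: recall measures how much of the relevant material an engine returns, while precision measures how much of what it returns is actually relevant. A toy example (the numbers are made up) makes the trade-off concrete:

```python
# Toy numbers: suppose 8 pages out there are actually relevant to a query,
# and the engine returns 10 results, 4 of which are relevant.
relevant_returned = 4
total_returned = 10
total_relevant = 8

precision = relevant_returned / total_returned  # 0.4 - how much of what came back is useful
recall = relevant_returned / total_relevant     # 0.5 - how much of the useful stuff came back

print(f"precision={precision}, recall={recall}")
```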
The next big area in search will be context-sensitive search.
Context-sensitive search means taking into account what you mean by your search. An example I used recently: if I search on Google for “All Blacks”, Google has no idea whether I want to find the All Blacks website, buy All Blacks shirts, find the score from the last game, become an All Black, etc. It has no context surrounding my search.
I was pleased to see that Yahoo have started investigating this area (I’m sure they all are, but Yahoo appears to be the only one with anything publicly visible in this domain). They have a beta search called Yahoo Mindset. It’s quite cool – at the moment it’s targeted at letting you slide the results between online shopping and academic research.
Yahoo have adopted AJAX for this search engine, so as you adjust the slider the results automatically reorder themselves based on the context in which you’re searching. I haven’t done too much searching with it yet so I can’t say much about how accurate it is once the context is applied. I’m really hanging out to see them move forward with this – perhaps several contextual sliders, or some intelligence around what context types are available for the type of search done (for example, in a specialised area like Apple Computers, a slider between “hardware” and “software” would be great).
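As a rough sketch of the idea (everything here – the scores, field names and numbers – is made up, not how Mindset actually works internally), the slider behaviour might be as simple as each result carrying a score for each context and the slider blending between them:

```python
# Entirely made-up scores and field names - just to show the blending idea.

def rerank(results, slider):
    """Re-rank results for a slider position from 0.0 (all shopping)
    to 1.0 (all research)."""
    def blended(r):
        return (1.0 - slider) * r["shopping"] + slider * r["research"]
    return sorted(results, key=blended, reverse=True)

results = [
    {"url": "store.example.com/all-blacks-shirts", "shopping": 0.9, "research": 0.1},
    {"url": "en.wikipedia.org/wiki/All_Blacks",    "shopping": 0.1, "research": 0.9},
]

# Dragging the slider towards "research" flips the ordering.
for hit in rerank(results, slider=0.8):
    print(hit["url"])
```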
It’s evident that the data is still being analysed for this new search – when I did a search on C# I was informed that they were currently analysing this search and to retry in a few moments to get more results. It’s great to see that sort of transparency about what is going on.
- JD
28 November 2005 in General | Comments (2)
A few months back I got around to buying an nVidia 6800 graphics card. It’s a reasonably powerful card and should have all the bells and whistles to ensure most games run pretty well. I started off using it without any problems – mostly on older games that hadn’t run too smoothly on my old Radeon 9600 card (SimCity 4 etc.).
However, over time I started trying some of the newer games like Far Cry, Serious Sam II and Age of Empires 3. I was totally unimpressed by the problems that cropped up. Far Cry would run well for maybe 10 minutes and then start to show artifacts and become unplayable. Even more modern games like Age of Empires 3 would actually hard-lock my machine before I could even get into the game. Considering how much I paid for the card I was really unhappy.
My real challenge was that I didn’t want to try to return the card and then not be able to prove it was broken – it does work for most older games.
First off I tried the obvious – different versions of drivers. What I found when I tried to upgrade to the newest drivers was that my system hard-locked as soon as Windows loaded. Not such a good thing. Some reading showed that the latest drivers have serious issues… with dual-core systems – which I don’t have. So I went back to some older drivers so I could at least use Windows.
So I guessed that perhaps the card was overheating. I used the tools that came with my card to check the temperature: it sat around 40 degrees Celsius and would go up to around 55 once games were cranking. From what I read this wasn’t too bad for a modern graphics card.
My next theory was that it could be a power issue. These modern surfboard-sized graphics cards have their own Molex connector to really suck the power down. The monitoring tool that came with the card reported that the voltages looked about right. They did fluctuate slightly when games ran. The problems continued even when I unplugged all unneeded components (DVD drive, second hard disk etc.).
So I went and acquired a new computer case – a reasonably nice Thermaltake case with a lot more cooling and a beefed-up power supply (430W). The voltages still varied slightly, but I can now rule power out as the issue. The card also runs about 5 degrees cooler, which is great. Unfortunately the problems persisted.
After a while I started to think about what the games that failed had in common. My older games ran fine, and the common thread among the newer ones is that they utilise pixel shaders. Pixel shaders are a fancy new way of programmatically getting the graphics card to do some cool rendering (it’s what makes fancy new graphics so fancy, I guess you could say). There is a lot happening in this space and it’s not going to go away. As a side note, we’re now up to Shader Model 3.0 (SM 3.0) – according to nVidia my card should support everything up to and including 3.0.
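For anyone who hasn’t met the concept: a pixel shader is just a small program the card runs for every pixel it draws. Real shaders are written in languages like HLSL and run in parallel on the GPU; this little Python sketch (purely illustrative, with made-up colours) shows the per-pixel idea:

```python
# Illustration only: a pixel shader is a tiny function run once per pixel.
# Real shaders are written in HLSL/GLSL and run in parallel on the GPU;
# this CPU loop just demonstrates the per-pixel model.

def sepia_shader(r, g, b):
    """Map one input colour to one output colour (a classic sepia tint)."""
    return (
        min(255, int(0.393 * r + 0.769 * g + 0.189 * b)),
        min(255, int(0.349 * r + 0.686 * g + 0.168 * b)),
        min(255, int(0.272 * r + 0.534 * g + 0.131 * b)),
    )

# A tiny 4x2 "framebuffer" of identical orange pixels.
width, height = 4, 2
framebuffer = [[(200, 120, 80)] * width for _ in range(height)]

# The GPU effectively runs the shader for every pixel at once.
shaded = [[sepia_shader(*px) for px in row] for row in framebuffer]
print(shaded[0][0])
```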
Sure enough, in some of these games I could downgrade the version of the shaders used (or turn them off completely). Doing so made the games run a lot better – no artifacts or hard-locking. My next challenge was getting some hard proof that the card had defective shaders. After some hunting on Google I discovered a cool application from Microsoft called the Display Compatibility Test Kit (DCT). DCT is the application hardware vendors use to test their hardware to ensure it meets WHQL requirements. 107MB later it was downloaded onto my machine.
A word of warning for anybody looking to run DCT – if you run all the tests included in the DCT application it can take around 36 hours to complete the testing!
I dug around and found the tests relating to pixel shaders, kicked them off and went to bed. I’ve attached a picture showing what happened (the normal Windows logo means the test ran successfully; the ones with a red cross failed). My understanding is that my card should have passed all of these.
I thought I had found some concrete proof of problems. Unfortunately things got a whole lot murkier when I realised that despite nVidia’s cards being WHQL certified by Microsoft, their newest cards actually fail the DCT tests (which they are supposed to pass). I still believe that my results point to damaged pipelines – I actually got up at one point in the night (yes – I’m that much of a geek) and watched some of the tests. Sure enough, the ones that failed showed the same corruption (mostly red dots).
If anybody out there actually feels like going to the great length of downloading DCT, running the shader tests on their 6800 and showing me how they fared, I’d be pretty stoked. At this point I’m going to go old school and just take some digital photos of my computer with the corruption and hope that it will be enough to get a new card. I don’t think they would have too many problems with me getting a replacement, but on the off chance I get somebody who wants to thoroughly test it to prove the fault, I want to be armed with all the facts I can get.
- JD
25 November 2005 in General | Comments (0)
Rowena and I usually exchange Christmas gifts around the start of December each year as it’s our anniversary (of going out – we’re not married, for those who don’t know us). We both quite like gifts, giving and receiving, so each year the exchange seems to creep in a bit earlier.
This year Rowena wowed me with an awesome 60GB iPod Video. It’s one of the sexiest gadgets I’ve ever seen. Really cool – I haven’t owned an iPod of any sort before, so this is a great dive into the pool. The screen has really nice colour and quality, which makes it great for watching things on.
The first tool in my toolbox for making the iPod Video really easy to use is the ImToo iPod Movie Converter. This tool takes just about any format of video you have and converts it really quickly to iPod-compatible video. You can configure a few settings – enough to make it easy to pick up. It doesn’t rip directly from DVD, but with some free tools out there you can rip to another format first and then convert. This is software that just works.
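If you’d rather script the conversion step than use a GUI, the free ffmpeg tool can do something similar. A rough sketch only – the exact codec names and flags depend on your ffmpeg build, and the filenames are made up:

```python
# A scripted alternative sketch: shell out to the free ffmpeg tool to produce
# an iPod-friendly MPEG-4 file. Flag and codec names depend on your ffmpeg
# build, so double-check them; the filenames are made up.
import subprocess

def convert_for_ipod(source, target):
    """Convert `source` into a 320x240 MPEG-4 file the iPod Video can play."""
    subprocess.run(
        [
            "ffmpeg",
            "-i", source,        # input: just about any format ffmpeg can read
            "-vcodec", "mpeg4",  # MPEG-4 video, which the iPod supports
            "-s", "320x240",     # scale to the iPod Video's screen size
            "-acodec", "aac",    # AAC audio
            target,
        ],
        check=True,
    )

convert_for_ipod("holiday.avi", "holiday.mp4")
```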
So this weekend I’ll spend some time finding out how the iPod works – I’m really stoked.
- JD
24 November 2005 in Intergen & Search Technology | Comments (0)
Last night I delivered a presentation to various Intergen clients about how to improve search both on their websites and their positions on global search engines.
There was a really good turnout of people coming to see what we had to say, and I hope I provided enough detail to get people thinking about their own situations. The presentation is reasonably high level and probably won’t appeal to the hardcore geeks out there. If anyone is looking for more detail about a certain area of search, or would just like some ideas about how to improve your own search, I would love to hear from you.
To download the presentation, click here.
- JD
23 November 2005 in Search Technology | Comments (0)
So Google Analytics has finally kicked in and given me some reporting goodness. I thought I’d post a wee bit more about it.
The first thing I found interesting was the geo-tracking. What impressed me is that somehow (if you know how, let me know!) Google manages to tell the difference between a person browsing my blog from Wellington and somebody browsing from Raumati Beach. I like the idea of being able to track things like loyalty and networks through an application that is not only free but so very slick.
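My best guess is that it’s IP-based: a database mapping ISP address blocks to localities. Something like this toy sketch – the address ranges below are completely made up, not real Wellington or Raumati Beach allocations:

```python
# A toy illustration of IP-based geolocation - the address blocks below are
# completely made up, not real allocations for these towns.
import ipaddress

GEO_DB = [
    (ipaddress.ip_network("203.97.0.0/16"), "Wellington"),
    (ipaddress.ip_network("219.88.0.0/16"), "Raumati Beach"),
]

def locate(ip):
    """Map an IP address to a locality (simple first-match lookup)."""
    addr = ipaddress.ip_address(ip)
    for network, city in GEO_DB:
        if addr in network:
            return city
    return "Unknown"

print(locate("203.97.1.2"))  # -> Wellington
```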
You’ve got export options (tab-separated, XML, Excel), overview reports and much more. For anybody who is serious about analysing the performance of their website, this is a great tool. Looking at all the options relating to the AdWords functionality, it would be absolutely imperative to use Google Analytics to ensure you’re tracking as well as you could be in terms of revenue per visitor.
I’ll be presenting this evening at an Intergen Twilight Seminar on improving search (both within your website, and your global rankings). This isn’t a tool I’ll be demonstrating (time constraints), but it should certainly be part of any webmaster’s toolbox.
Link: Google Analytics
– JD
22 November 2005 in Search Technology | Comments (0)
Recently I’ve been plugging in the free tools from Google.
First off, Google SiteMaps. Using SiteMaps I can…
- Tell Google what pages are in my site
- Rank pages within my site on how important I think they are
- See details of the PageRank Google has given me
- See what searches people use that bring up my site
- See which of those searches people actually click through on
- Ping Google when I make changes to certain pages
- Tell Google how often to recheck certain pages (homepage, comments, static pages etc.)
It doesn’t look as polished as Google Analytics but the information and functionality you gain from using Google SiteMaps is great. As a side note, if you’re using WordPress as your blog software there is a great plug-in that makes using SiteMaps easy. Click here to get it.
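To make the list above concrete, here’s a minimal sketch of the two moving parts as I understand them: the sitemap XML file itself (page, priority, change frequency) and the ping that tells Google to re-fetch it. The schema URL and ping endpoint are my assumptions from memory, so check the protocol docs for the real details:

```python
# A sketch only: the sitemap schema URL and Google ping endpoint below are
# assumptions from memory - verify against the current protocol docs.
import urllib.parse
import urllib.request

PAGES = [
    ("http://example.com/",      "1.0", "daily"),    # homepage: recheck often
    ("http://example.com/about", "0.3", "monthly"),  # static page: rarely changes
]

def build_sitemap(pages):
    """Build sitemap XML listing each page with a priority and change frequency."""
    entries = "".join(
        f"  <url><loc>{loc}</loc><priority>{pri}</priority>"
        f"<changefreq>{freq}</changefreq></url>\n"
        for loc, pri, freq in pages
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}"
        "</urlset>\n"
    )

def ping_google(sitemap_url):
    """Tell Google the sitemap changed (historical ping endpoint - an assumption)."""
    query = urllib.parse.urlencode({"sitemap": sitemap_url})
    urllib.request.urlopen(f"http://www.google.com/webmasters/sitemaps/ping?{query}")

print(build_sitemap(PAGES))
```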
Next up is Google Analytics.
Google Analytics focuses a bit more on general site statistics. There is also a lot around AdWords and how to better use it; however, as I don’t use AdWords on my blog, I don’t know much about that side. Personally I just want to look at some pretty graphs to see how I’m tracking.
I’m still waiting for Google to step their game up on Analytics – I signed up on Monday last week and I’m still getting apology messages saying they’re overloaded.
I wouldn’t be surprised if, in the longer term, SiteMaps and Analytics get rolled into one.
If anyone has any feedback on their experiences with these two tools then post below.
– JD
7 November 2005 in .Net & Tools | Comments (0)
A couple of weekends back I got a chance to play with Cuyahoga. I spotted a reference to it on Nic’s blog.
For those of you who don’t know what it is, Cuyahoga is an open source content management system (CMS). I get involved with CMS products from time to time at work (EPiServer and Microsoft Content Management Server) so I like to look at other products in that space. One thing that really impressed me about Cuyahoga was the ability to manage several sites from the one central administration area. I like it when I see some of these enterprise features available in smaller products.
While doing some reading about Cuyahoga I stumbled across an interesting site that might be useful for other people who touch on the CMS space: http://www.cmsmatrix.org/. CMS Matrix provides a tool for comparing the features, price etc. of heaps of different CMS products.
Overall I liked the experience but some of the more basic (expected?) features seemed to be lacking just a bit. I’m looking forward to playing with some of the future versions – it looks like it would be great to leverage for some personal projects. Now, if I could just work out how to pronounce Cuyahoga…
– JD