There are a tonne of write-ups (and some great photos) about the Distilled Link Building event that took place on Friday, so I’ll keep this short and sweet: it rocked. There was a great mixture of high-level strategy tips and low-level techniques to use.
To illustrate: Reddit is currently piling up the sandbags ready for the onslaught that could come when people follow one strategy given away by the venerable Russ Jones.
Every single speaker had me scribbling down notes as fast as I could manage, and daydreaming about how I could deploy some of their ideas. All the attendees seemed to be in a trance, and it was quite exciting to be there for this groundbreaking link building event.
If you are stateside and can make it to New Orleans this week, I think there are still tickets available for the next iteration of the event. I cannot recommend it enough; I love the SEO community because there is such a great culture of sharing knowledge, but I was still surprised by the depth of the amazing tips that some of the speakers were giving away.
After the main event, those who could withstand the urge to go and start trying out their new link building ideas right away headed over to a nearby pub/bowling place for some networking, bowling and a few beers.
I met some great people there, and exchanged some cool ideas.
Lastly, if you’ve been over at the new SEOmoz Q&A building up a few mozPoints (damn you Egobait!) and weren’t sure if they’re worth the effort then I have news. No sooner did I complain to Rand that the promised ‘hug from Roger’ (SEOmoz’s mozBot) failed to arrive when I reached the requisite 50 mozPoints than he set upon me himself to make up for it. I could feel the SEO knowledge emanating from him… I think that’s what it was.
See you next year!
After my post Geocoding UK Postcodes with Google Maps API, I’ve had a few people contact me about caching geocoding results back on the server for use on subsequent pages.
It’s a good question – Google’s geocoder permits you to make 50,000 queries a day, which sounds like a lot. However, that works out to only about 35 a minute; sustain a higher rate for more than a few minutes and the limit apparently kicks in. So you might be interested in caching your results.
If you aren’t fussed about UK geocoding, you can access the regular Google geocoder using HTTP, as documented in the Google Maps Documentation.
This short tutorial will show you how, once a postcode has been translated into latitude and longitude, the result can be sent back to your server to be temporarily stored in a database. Then we are going to look at how we can query our own database before we query Google’s for each result.
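The overall flow can be sketched as below. This is a minimal illustration, not the tutorial’s actual code: the function names (`normalizePostcode`, `getLatLng`) and the in-memory object standing in for the server-side database are all my own, and the real geocode step would be a call to the Maps API with the result POSTed back to your server.

```javascript
// Normalise a postcode so "sw1a1aa" and "SW1A 1AA" hit the same cache row.
function normalizePostcode(postcode) {
  var compact = postcode.toUpperCase().replace(/\s+/g, "");
  // UK postcodes always end in a three-character "inward" code.
  return compact.slice(0, -3) + " " + compact.slice(-3);
}

// Stand-in for your server-side database of cached results.
var localCache = {};

// Check our own cache first; only fall back to the (stubbed) geocoder
// on a miss, then store the answer for next time.
function getLatLng(postcode, geocodeFn) {
  var key = normalizePostcode(postcode);
  if (localCache[key]) {
    return localCache[key];      // cache hit: no request to Google
  }
  var result = geocodeFn(key);   // cache miss: ask Google...
  localCache[key] = result;      // ...then remember the answer
  return result;
}
```

Normalising the postcode before using it as a key matters: without it, `sw1a1aa` and `SW1A 1AA` would each cost you a separate Google query.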
Note: I’m told it used to be in Google’s Terms that you couldn’t store geocoding results, but it doesn’t appear to be that way now. However, we are only going to store them temporarily, so we don’t have to hit Google’s server repeatedly for the same query.
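One simple way to keep the storage genuinely temporary is to record a timestamp alongside each cached result and treat anything older than some cutoff as stale, re-querying Google for it. The sketch below assumes a hypothetical cache entry with a `storedAt` field and a 30-day cutoff; both are my own choices, not anything mandated by Google.

```javascript
// Treat cached entries older than 30 days as stale.
var CACHE_TTL_MS = 30 * 24 * 60 * 60 * 1000;

// Returns true only if the entry exists and is within the cutoff;
// a stale or missing entry means we go back to Google's geocoder.
function isFresh(entry, now) {
  return !!entry && (now - entry.storedAt) < CACHE_TTL_MS;
}
```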
If you are eager just to see how this all will work, you can go straight to the demo page.
So, let’s get going…
Notice: As a few people have pointed out, this announcement from Google means geocoding is now built in. Yet, as more people have pointed out, its accuracy leaves a lot to be desired (think over a mile off on some postcodes!), whereas my method continues to be accurate.
Google Maps API provides a geocoding feature for finding the latitude and longitude of places or addresses, but it does not work for UK postcodes. This is thanks to Royal Mail, who hold the copyright on the data and are very restrictive with their (expensive) licences for it.
There are various solutions out there for using third-party services and importing the data to be used with Google Maps, or for using community-built databases for the info. However, I’ve had a few people ask me about doing it just through Google.
It is possible — Google AJAX Search API does provide geocoding for UK postcodes. We need to use the two APIs in harmony to achieve our result.
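The bridge between the two APIs boils down to running a search for the postcode and pulling the coordinates out of the first result. In the browser this would sit in a GlocalSearch completion callback, with the point handed on to the Maps API; the pure extraction step is sketched below with the search result stubbed. The result shape (coordinates as string `lat`/`lng` properties on each result) reflects my reading of the AJAX Search API’s local results, so treat it as an assumption.

```javascript
// Pull a numeric {lat, lng} out of an AJAX-Search-style results array,
// or null if the postcode produced no results.
function resultToLatLng(results) {
  if (!results || results.length === 0) {
    return null;                 // postcode not found
  }
  var first = results[0];
  return {
    lat: parseFloat(first.lat),  // coordinates arrive as strings
    lng: parseFloat(first.lng)
  };
}
```

In the real page, the returned point would then be passed to the Maps API, e.g. to centre the map or drop a marker at the postcode.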
So here it is.
Is 2006 the Year of the Mashup? I think not. Mashups are at the stage that DHTML was at before it matured into Web 2.0 – lots of bells and whistles, but little real meat. 2006 will see people acclimatise themselves to the principles and technology, and maybe the best thought out mashups will become established. But it will be 2007 that will truly be the Year of the Mashup.
Programmableweb.com currently tracks 979 mashups and 269 APIs, and shows a rate of 2.7 new mashups per day. Mashups are obviously very popular, so what’s the problem?
d.Construct 2006 is over and it seems a great deal of fun was had by all.
Lots of people have commented on the day as a whole, so I’ll save repeating what they have said. To read them, check out Technorati, or see the list Aral Balkan has made.
Overall, I had a fantastic time down in Brighton. The presentations were both fun and informative, and I had a great time at the after party, meeting some awesome people who I really hope to meet again.
An extra bonus of the day was that I won a prize for the Morse code message I left for the podcast. I won several books, provided by Apress, all of which will be very useful.
On a side note: The Mac to PC ratio was about 9:1, which was much higher than expected. I’ve always been an Apple fan, and worked for Ambrosia Software for a number of years before moving to the web industry, so it was great to see the glowing Apples everywhere I looked. 🙂
If you’ve been listening to the d.Construct podcast prior to the Brighton-based web conference, you’ll know that Jeremy Keith got a strange Odeo message he pondered might be Morse code.
Well, that one wasn’t, but Jeremy appealed for more Odeo messages, so I thought it would be fun to send him a real one.