July 25, 2006
Google has announced that they are offering live traffic data in their mobile maps application. Just point your mobile device to www.google.com/gmm to download the application.
The latest version of Google Maps for mobile will enable users in the U.S. to view comprehensive information on traffic conditions in more than 30 major metropolitan areas (and partial information in many others) right from their mobile devices. To get information on traffic conditions in a particular area (including San Francisco, Los Angeles, Houston, Chicago, Minneapolis, Phoenix, Seattle, Washington, D.C., and New York City), users simply move to the desired location within the application and select “show traffic” in the menu. The most up-to-date traffic information will be sent directly to the user’s mobile device, and will highlight the conditions on the covered commuter routes using red, yellow, and green overlays.
In addition, when mobile phone users search for driving directions, they will now see the expected drive time as well as any unexpected traffic delays, making travel planning much easier and more effective.
Now the question is: why isn’t this feature on the web version of Google Maps?
My first thought was that this was a great application that would compete with the real-time traffic features built into some vehicles’ navigation units (like Acuras and Cadillacs)…except this one is free of recurring monthly payments. I tried it out and it was *very* slow…almost to the point of being impractical (the response time was slower than the time it takes to drive the area in question). Most people I know will, instead, continue to use scripts that have already prefetched sigalert data from the Caltrans site, presliced the image to fit within their phone’s browser, and put the tiles up on their own webserver for nice, zippy response times.
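The prefetch-and-preslice trick can be sketched roughly like this. Everything here (the function name, the tile dimensions) is hypothetical; the real scripts would also fetch the image on a cron and crop each box (e.g. with an image library) into files served from the webserver’s docroot.

```python
# Hypothetical sketch of the preslicing step: compute crop boxes that
# split a large traffic map into phone-screen-sized tiles.

TILE_W, TILE_H = 176, 208  # a typical 2006-era phone screen; an assumption

def tile_boxes(img_w, img_h, tile_w=TILE_W, tile_h=TILE_H):
    """Return (left, top, right, bottom) crop boxes covering the image."""
    boxes = []
    for top in range(0, img_h, tile_h):
        for left in range(0, img_w, tile_w):
            boxes.append((left, top,
                          min(left + tile_w, img_w),
                          min(top + tile_h, img_h)))
    return boxes

# A cron job would then crop each box out of the fetched sigalert image
# and write the tiles to static files, so the phone only ever requests
# small, pre-rendered images.
```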
July 18, 2006
Reports are stating that Google will start supporting the NOODP meta tag that MSN introduced.
This has the advantage of suppressing the DMOZ description (which Google currently shows as the snippet for a site that has an ODP listing).
Here is an example:
Search for Ask’s Site and the first item you see is his site with the following description below it: “Blog by this Perl developer, perl.org webmaster, and Perl Foundation member.” It’s a very weak description of both Ask and his site. Descriptions can be suggested along with the site, but they are often rewritten by editors, and it is sometimes claimed that editors adopt categories they have their own sites in so they can write bad descriptions and titles for their competition (unknown if this is true).
Here is how you implement it within the HEAD tags of your web page:
<meta NAME="ROBOTS" CONTENT="NOODP">
July 5, 2006
I’ve mentioned how sites like Revver help you make money by placing ads in the video (the ads are shown even when the video is embedded in other pages, so those views still credit your account).
Well, eefoof is offering a revenue share to users who upload to their site:
When you upload a piece of media for submission to eefoof, your first hit immediately starts generating income. Each month, we measure the amount of individual page views for each item you submit, and then calculate the percentage of hits it accounted for within its media type. We then use this number to figure out your share of the site’s ad revenue. Once your account exceeds $25, we will send you a Paypal transfer to the email specified at account creation.
eefoof accepts video, images, flash, and audio.
This business model has been around for a while. Forums like Digital Point have been doing this type of thing for a while and have been successful in increasing adoption and, therefore, revenue (revshare means smaller margins but greater volume…if you do it successfully).
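The payout math eefoof describes above is straightforward to sketch. The $25 threshold comes from their quoted terms; the function name and the sample numbers are made up for illustration.

```python
# Sketch of the revenue-share math from eefoof's description:
# your cut = (your page views / total views for that media type) * ad revenue,
# paid out via Paypal once the balance exceeds $25.

PAYOUT_THRESHOLD = 25.00  # dollars, per the quoted terms

def monthly_share(your_views, all_views_for_media_type, ad_revenue):
    """Your share of one month's ad revenue for one media type."""
    return (your_views / all_views_for_media_type) * ad_revenue

balance = monthly_share(5_000, 100_000, 400.00)   # 5% of $400 = $20.00
balance += monthly_share(3_000, 50_000, 300.00)   # 6% of $300 = $18.00
print(balance >= PAYOUT_THRESHOLD)  # True -> a Paypal transfer would be sent
```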
About a month and a half to two months ago, I transitioned my site to a dedicated host.
I thought I had put in all my crons to backup my mysql databases.
Well, I went ahead and made a knucklehead SQL commit that changed a column I didn’t mean to! I was using the command-line mysql client and had forgotten to escape a quote in the statement, so the WHERE restriction was treated as part of the string value and every row of the table ended up with the same exact text for that column.
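This is exactly the class of mistake that parameter binding prevents. A sketch (using stdlib sqlite3 rather than MySQL, and a made-up table) of why: the value travels out of band, so a stray quote in it can never merge with, or swallow, the WHERE clause.

```python
# Sketch of parameter binding avoiding the quoting hazard described above.
# The table and data are hypothetical, not from the incident.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO posts (title) VALUES (?)",
                 [("first",), ("second",), ("third",)])

new_title = "it's tricky"  # the value contains an unescaped quote
# The ? placeholders keep the statement structure fixed no matter what
# characters the value contains, so only the intended row changes.
conn.execute("UPDATE posts SET title = ? WHERE id = ?", (new_title, 2))

titles = [row[0] for row in conn.execute("SELECT title FROM posts ORDER BY id")]
print(titles)  # ['first', "it's tricky", 'third']
```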
First thing I said was “no big deal…I’ll just grab last night’s mysqldump since I hadn’t added much data since then.” That’s when I found out my backups weren’t there. I looked up my old mysql files on the old server I had transitioned off of, but a significant amount of data was missing.
Thankfully, I was able to extract all the SQL from the mysqlbinlogs. The only problem was that the SQL statements mixed updates and inserts…and the inserts did not reference any sort of primary key within the table (the keys were auto_incremented).
I ended up writing a perl script that parsed the SQL statements, picked out other columns that should be unique when combined, and used those to regenerate and execute SQL against the table to update (rather than reinsert) the data.
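The idea behind that script can be sketched like so (in Python here, not the actual perl). Everything is hypothetical: it assumes each replayed INSERT carries columns (user, posted_at, body) whose combination is unique, so the statement can be rewritten as an UPDATE against the existing auto_incremented rows instead of being re-inserted.

```python
# Sketch of turning replayed binlog INSERTs into UPDATEs keyed on a
# composite of columns assumed unique. Table and column names are made up.
import re

INSERT_RE = re.compile(
    r"INSERT INTO comments \(user, posted_at, body\) "
    r"VALUES \('([^']*)', '([^']*)', '([^']*)'\)")

def insert_to_update(stmt):
    """Rewrite a replayed INSERT as an UPDATE keyed on (user, posted_at)."""
    m = INSERT_RE.match(stmt)
    if m is None:
        return stmt  # leave UPDATEs and anything unrecognized alone
    user, posted_at, body = m.groups()
    return ("UPDATE comments SET body = '%s' "
            "WHERE user = '%s' AND posted_at = '%s'" % (body, user, posted_at))
```

A real version would also have to cope with escaped quotes inside the values, which is part of why parsing replayed SQL is such a delicate recovery path.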
I could have also used the older mysql files off of the old server and just rerun all the SQL from the binlogs, but I didn’t want to risk losing any inserts that may have happened during the transition, or while I was moving mysql files around when optimizing the database after the move.
Anyway, everything was 100% recovered, thank goodness!