October 21, 2008

Mapquest iPhone Site Launches

Mapquest announced that they launched their iPhone/iPod touch site.

I went to check it out. Just go to their homepage with an iPhone and you’ll be taken there automagically.

The web-based application is pretty slick, acting like Google’s app that sits on the deck by letting you use your finger to pan around, etc. Other than a bit of lag on some requests, I’m surprised it hasn’t crashed on me yet…Safari tends to be fickle with JavaScript-heavy sites. I do, however, wish that they pre-populated the forms based on prior searches (conducting a map lookup and then going to the directions page, or vice versa, loses your input). They also solved another issue I’ve had with the iPhone…clearing out a form. Check out the “x” inside the input elements that you can tap to clear the field. Good work, guys!


October 15, 2008

What is a Pagerank 9 Site Worth To You?

Sorry, readers…I know it has been a while but I have been hard at work on another project related to YellowBot. Stay tuned to find out more about what that is…In the meantime, I’m hoping I can resume blogging regularly…

So I was looking at the Clearspring acquisition of AddThis.

AddThis is 2 years old and provides a bookmarking widget that you must have seen all over the place (see the “Bookmark” button at the bottom of this post). The AddThis homepage boasts “served 20 billion times per month” and lists the following as “Clients & Partners:”

TIME, Oracle, TechCrunch, Freewebs, Entertainment Weekly, Topix, Lonely Planet, MySpace, PGA Tour, Tower Records, Squidoo, Zappos, FOX, ABC, CBS, Glamour, WebMD, American Idol, HitsLink, Widgetbox, Template Monster, GetAFreelancer, ReadWriteWeb, Brothersoft, E! Online, iGuard

AddThis widgets/links are everywhere. It’s no surprise it is a pagerank 9!

When Clearspring, the widget company, acquired them, do you think that discussion came up? Clearspring increased its reach online, but how many visitors, as a ratio of its overall traffic, actually go to the homepage? Users see the widget, some click on a link and are redirected to the appropriate bookmarking service. Site owners might log in to their account to see traffic and statistics. In any case, I’m not sure what Clearspring paid for the acquisition, but it was probably a smart buy considering its popularity and traffic levels.

from AllThingsD:

But Ted Leonsis, chairman of the board at Clearspring, and CEO Hooman Radfar said revenue would come via advertising and, eventually, valuable data analytics the services collect about Web behavior.

Currently, said Leonsis, AddThis has negligible revenue and Clearspring has about $10 million in annual sales. Neither is currently profitable.

Clearspring has about 100 employees and AddThis has a handful. I have not seen any articles about AddThis getting funded, while Clearspring has received $35 million.

ShareThis is a similar service that some people like better because it also allows sharing via email (in addition to other services). It’s currently a pagerank 7 and has been funded, receiving a total of $21 million.

So what would you have paid to acquire AddThis (with a pagerank 9, even if they lost some traffic over the last few months) and put your link(s) at the bottom of their page?


February 15, 2008

Inside EveryBlock

Rex Sorgatz had An Interview with Adrian Holovaty. Adrian created ChicagoCrime.org (now redirects to his new venture, EveryBlock) back when we would have to use hacked methods (rather than an API) to use google maps (such as the Google Maps Standalone method).

Adrian talks about some of the challenges at EveryBlock which definitely rang a bell with me. Here are a few interesting passages that developers in the local space and/or aggregators of data may be able to relate with:


One of our post-launch priorities is to clean up the fire-hose of raw information, to introduce concepts of priority and improved relevance — but I do think there’s a certain appeal to that raw dump of “here’s everything that’s happened around this address, in simple, reverse-chronological order.” When significant events happen, they sort of “POP out” of the list.

The first layer is the army of scripts that compile data from all over the Web. This includes public APIs, private APIs, screen-scraping the “deep Web,” crawling news sites, plus harvesting data from PDFs and other non-Web-friendly documents. Some data also comes to us manually, like in spreadsheets e-mailed to us on a weekly basis. For each bit of data, we determine geographic relevance and normalize it so that it fits into our system.

The second layer is the data storage layer, which we built in a way that can handle an arbitrary number of data types, each with arbitrary attributes. For example, a restaurant inspection has a violation (or multiple violations), whereas a crime has a crime type (e.g., homicide). Of course, we want to be able to query across that whole database to get a geographic “slice,” so there’s a strong geo focus baked into everything.

The user interface was, and continues to be, a challenge. How do you display so many disparate pieces of data together, without overwhelming people?

Dealing with structured data is relatively easy, but attempting to determine structure from unstructured data is a challenge. The main example of unstructured data parsing is our geocoding of news articles. We do a pretty good job here, but we’re not crawling all of the sources we want to crawl — again, there’s a lot of room to grow.

On a completely different note, it’s been a challenge to acquire data from governments. We (namely Dan, our People Person) have been working since July to request formal data feeds from various agencies, and we’ve run into many roadblocks there, from the political to the technical. We expected that, of course, but the expectation doesn’t make it any less of a challenge.
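The “second layer” Adrian describes — a store that handles an arbitrary number of data types, each with arbitrary attributes, yet can be queried for a geographic “slice” — can be sketched in a few lines. This is my own illustration of the idea, not EveryBlock’s actual code; the names (`Item`, `GeoStore`) and the naive bounding-box scan are assumptions for clarity:

```python
from dataclasses import dataclass, field

@dataclass
class Item:
    """One record of any type (crime, inspection, ...) pinned to a point."""
    kind: str
    lat: float
    lng: float
    attrs: dict = field(default_factory=dict)  # arbitrary per-type attributes

class GeoStore:
    """Toy store: any data type goes in, geographic slices come out."""
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)

    def slice(self, lat_min, lat_max, lng_min, lng_max):
        """Return every item, of any type, inside a bounding box."""
        return [i for i in self.items
                if lat_min <= i.lat <= lat_max and lng_min <= i.lng <= lng_max]

store = GeoStore()
store.add(Item("crime", 41.88, -87.63, {"crime_type": "homicide"}))
store.add(Item("inspection", 41.89, -87.62, {"violations": ["rodents"]}))
store.add(Item("crime", 34.05, -118.24, {"crime_type": "burglary"}))

# A geographic "slice" pulls back disparate record types together.
downtown_chicago = store.slice(41.85, 41.95, -87.70, -87.60)
```

A real system would of course use a spatially indexed database rather than a linear scan, but the point is the same: the geo focus is baked into the query layer, not into any one data type.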

Rather than use Google Maps or Microsoft’s Virtual Earth, you built your own mapping service application. Why?

That, along with “When will you bring EveryBlock to city XXX?”, is by far the most frequently asked question we get. Paul, our developer in charge of maps, is working on an article explaining our reasoning, so I don’t want to steal his thunder. I’ll just say that the existing free maps APIs are optimized for driving directions and wayfinding, not for data visualization. And, besides, having non-clichéd maps is an easy way to set yourself apart. Google Maps is so 2005. ;-)

We use an open-source library called Mapnik to render the maps, so that library does the heavy lifting for us. Paul is also working on a how-to article, in the spirit of giving back to the open-source community, that explains how to use Mapnik.

I strongly suspect we’ll have an API eventually

[via kottke]


October 12, 2007

Los Angeles Tech Events

The LA tech community has been working on putting together various events such as Twiistup and Lunch 2.0.

Today, YellowBot is hosting Lunch 2.0.

Vani & Ask, after getting the idea from Erik and Chad, started a new blog to talk specifically about Los Angeles tech events.

I also created a public Google Calendar that you can subscribe to in order to track these events, and I have added all of them as publishers to that calendar.


October 2, 2007

Google Ends Immersive Media Relationship and Talks of Future

Immersive Media announced that their deal with Google for StreetView has terminated. One can only speculate about who terminated it and why, or why Google decided not to acquire them. Google’s own fleet of StreetView cars has been spotted on its lot, and IM says the cars have been sent out. Google has been doing its own collecting while simultaneously using IM for a while now.

Why would Google build its own fleet to begin with? The press release mentions that their content licensing deal has ended. If Google were licensing the technology, they could own the content; but the press release indicates they are licensing the content, so IM would own those photos. Could this mean that the licensing terms were too restrictive? When I was at SES in San Jose this year, I went to the Google Dance on the Google campus, and the engineer working on the StreetView project (he was in charge of the pictures, not the programming) said they were planning to bring those photos into the API so people could use them for their mashups. I have already noticed that a lot of features and data that exist on their site are not in the API because of other licensing issues. Could this have been one of them? Or did Google want to ramp up so quickly, and at such a large volume, that IM could not handle the demand in a short period of time…so Google built its own fleet and, once it was complete and large enough, no longer needed IM? Well, we won’t know for sure until someone says more.

Meanwhile, at SMX Local & Mobile, Michael Jones, Chief Technologist for Google Earth, Google Maps, and Google Local, gave a keynote where he spoke a bit about the future direction of their products and the industry. Some takeaways include:

  • Google strives to be a local searcher’s concierge (as in a concierge that helps you at a hotel or elsewhere)
  • There’s other data out there that must be mapped into local (the ones he mentions, such as traffic, reviews, etc., are no-brainers and already happening…what is up Google’s sleeve??), all feeding into “geographically organizing the world’s information”
  • Google knows they have a lot of work to do to improve their product
  • Crowdsourcing is a way to collect info and improve data (perhaps a way to internalize the risk of external contracts)

May 31, 2007

Snap Names Acquired By Oversee.net

A number of developers (mostly Perl developers) I have worked with have been recruited by Oversee.net in downtown Los Angeles. Oversee.net predominantly started by registering domains (sometimes tasting them) and using them to make money off of clicks. They have since used some of their premium domains to host various sites and services, building them out with content and tools.

Jeff was one of the first employees. They have since exploded to hundreds of people and now they are expanding.

From an email I have received from Snap Names, a domain drop catching service:

To SnapNames Customers:

I’m writing to inform you that SnapNames has agreed to be acquired by Oversee.net. Oversee, a company already familiar to many in the domain name industry, is a technology-driven online marketing solutions company that offers an impressive array of services to domain name owners. You can learn more about the company at www.oversee.net.

It’s important that you understand there will be no changes to the way SnapNames provides its services. This is a combination of two industry leaders with outstanding reputations for serving domain name investors and customers at all levels.

We were attracted to Oversee for many reasons, including the opportunity to offer SnapNames customers a greater breadth of service offerings. Together, the two companies can provide services that support our customers’ needs throughout the entire life cycle of a domain name, including procurement, monetization and sales.

This transaction is expected to close in mid-June. There is more information available on our Web site at www.snapnames.com. Of course, we’re always available to assist you in any way we can, and encourage your questions and comments. Our support team is available to you here:

On the Web: http://snapnames.custhelp.com
By e-mail: support@snapnames.com

At SnapNames, you will continue to find the world’s largest selection of expired domain names. You’ll find no changes to your account or the way you do business with us. We value you as a customer and thank you for your continued business.

Sincerely,

Sudhir Bhagwan
Chief Executive Officer

Hopefully they don’t keep the best domains for themselves and continue allowing others to use their service to bid for some of those domains. :-)


May 30, 2007

Preview Google Mapplets

Google is allowing you to preview mapplets.

Here is how Google describes mapplets:

Mapplets are mini-webpages that are served inside an IFrame within the Google Maps site. You can put anything inside this mini-webpage that you can put into a normal webpage, including HTML, Javascript, and Flash. Google provides a Javascript API that gives the Mapplet access to services such as manipulating the map, fetching remote content, and storing user preferences.

When a Mapplet is enabled by the user, Google’s servers will fetch the Mapplet source code from your web server, and then serve it to the user from gmodules.com. To reduce the load on your server, gmodules.com will cache your source code for several hours.
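To make the quoted description concrete, a Mapplet source file was essentially a Google Gadget XML spec with HTML and JavaScript inside a CDATA block. The fragment below is a rough, from-memory sketch — the map calls (`GMap2`, `GMarker`, `GLatLng`) are assumptions based on the Maps API of the era, not a verified excerpt from the Mapplets documentation:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Module>
  <ModulePrefs title="Hello Mapplet" description="A minimal example" />
  <Content type="html"><![CDATA[
    Hello from inside the Maps IFrame!
    <script type="text/javascript">
      // Hypothetical sketch: grab the shared map and drop a marker on it.
      var map = new GMap2();
      map.addOverlay(new GMarker(new GLatLng(34.05, -118.24)));
    </script>
  ]]></Content>
</Module>
```

You would host a file like this on your own server, and gmodules.com would fetch and cache it as described above.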

There are lots of mapplets already available or you can make your own. Some existing ones include statistical information, trends around an area, movie times, and real estate information for a given location.

Publish your own to get people utilizing your site/data.


May 24, 2007

Real Estate API for Home Values, Sales Data, etc

Yahoo is providing an impressive set of APIs for developers.

The Yahoo Real Estate folks show us how they use a total of five APIs available to the public on their Home Values page.

Launched this morning, the new page combines three APIs available right here on the Developer Network with two more from Zillow, to provide a 360-degree view of what homes are worth in the neighborhood of your choice.

Yahoo! APIs In Use:

  • AJAX Maps - finds the home you’re searching for, recently sold comparables, and nearby similar homes for sale.
  • Local Search - finds and displays local appraisers and customer ratings.
  • Answers Question Search - finds and displays questions and answers for the query “home value.”

Zillow APIs In Use:

It’s a nice feature Yahoo has put together…but for developers out there who didn’t know these APIs were available (I know you’re out there…many of you send me emails with your questions), let’s see what you can do with them! :-)


February 8, 2007

More APIs from Yahoo

Both Google and Yahoo have released interesting APIs based on XML feeds and the HTTP protocol.

Under a year ago (I think), Google launched the GData API, which uses Atom or RSS to read/download, update, edit, or delete data. APIs are available for many of their products that use RSS or Atom feeds, such as Google Calendar. This allows people to create their own applications that leverage those products.
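Since GData is just Atom over HTTP, consuming a feed is plain XML parsing. A minimal sketch in Python — the feed content here is made up for illustration; a real client would fetch the XML from a service URL instead of a string:

```python
import xml.etree.ElementTree as ET

# A tiny Atom document of the general shape GData services return
# (the entry data is invented for this example).
ATOM = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>My Calendar</title>
  <entry>
    <title>Lunch 2.0</title>
    <id>tag:example.com,2007:event-1</id>
  </entry>
</feed>"""

NS = {"atom": "http://www.w3.org/2005/Atom"}
feed = ET.fromstring(ATOM)

# Pull the title of every entry in the feed.
titles = [e.findtext("atom:title", namespaces=NS)
          for e in feed.findall("atom:entry", NS)]
```

Updates and deletes in GData work the same way in reverse: you PUT or DELETE an entry’s XML against its edit URL.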

However, the more interesting product seems to have just arrived from Yahoo. They launched Yahoo Pipes (I was playing with it last night, but this morning they have the following message: “Our Pipes are clogged! We’ve called the plumbers!”). Hopefully they get things fixed shortly so others will be able to check it out.

Here’s what Tim O’Reilly has to say about it:

Using the Pipes editor, you can fetch any data source via its RSS, Atom or other XML feed, extract the data you want, combine it with data from another source, apply various built-in filters (sort, unique (with the “ue” this time:-), count, truncate, union, join, as well as user-defined filters), and apply simple programming tools like for loops. In short, it’s a good start on the Unix shell for mashups.

For those of you who use Unix, you know that pipes (|) send the output of one command to another so you can chain lots of logic together. I like the idea a lot and it has a nice interface, much of which will have a learning curve…but not as big as Google’s, since it leverages a drag-and-drop interface. This may mean more applications from creative people who are not as familiar with programming…which is always a good thing, because good ideas and products tend to build on existing ones.
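The “Unix shell for mashups” idea is easy to mimic in code: each operator takes a list of feed items and returns a new list, so calls chain like pipes. This is my own sketch of the sort, unique, truncate, and union operators O’Reilly lists, not how Pipes is actually implemented:

```python
# Each "operator" takes a list of feed items (dicts) and returns a new
# list, so they compose like Unix pipes: union | unique | sort | truncate.
def union(*feeds):
    return [item for feed in feeds for item in feed]

def unique(items, key):
    seen, out = set(), []
    for item in items:
        if item[key] not in seen:
            seen.add(item[key])
            out.append(item)
    return out

def sort(items, key):
    return sorted(items, key=lambda i: i[key])

def truncate(items, n):
    return items[:n]

feed_a = [{"title": "Pipes launches", "date": "2007-02-08"}]
feed_b = [{"title": "Pipes launches", "date": "2007-02-08"},
          {"title": "GData API", "date": "2006-04-01"}]

# Merge two feeds, drop duplicates, sort by date, keep the first five.
result = truncate(sort(unique(union(feed_a, feed_b), "title"), "date"), 5)
```

Pipes wraps exactly this kind of composition in a drag-and-drop editor, which is why non-programmers can wire mashups together with it.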

Pretty interesting stuff. I can’t wait to see what people do with it…I also wonder if people will do things like remove ads from feeds and aggregate feeds to create minisites they can repurpose as splogs or other similar sites…or if the original content creators will take any sort of action to prevent this from happening (I wonder if Yahoo uses a different user-agent, which could be specified in the robots.txt file, for example).

Yahoo employee Jeremy weighs in on Yahoo Pipes, as does Google employee Matt Cutts.

For you developers building local applications, Niall Kennedy brought my attention to a location extractor:

My favorite operator is the location extractor which analyzes an item’s text attempting to identify addresses, locations, or the URLs of popular mapping services.
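A naive version of that location extractor fits in a few lines: scan an item’s text for things that look like US street addresses. This is purely illustrative — real extractors use gazetteers and far more sophisticated parsing, and the regex below is an assumption of mine, not Yahoo’s:

```python
import re

# Toy "location extractor": find number + capitalized words + street
# suffix. Real systems handle far more address shapes than this.
ADDRESS_RE = re.compile(
    r"\b\d{1,5}\s+(?:[A-Z][a-z]+\s)+(?:St|Ave|Blvd|Rd|Dr)\.?\b")

def extract_locations(text):
    return ADDRESS_RE.findall(text)

hits = extract_locations(
    "The meetup is at 123 Main St. near 4510 Wilshire Blvd in LA.")
```

For local developers, the appeal is obvious: once Pipes has pulled the addresses out of a feed, it can hand them straight to a mapping service.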


January 11, 2007

More traffic to your local/mashup site using Google

The official Google Maps API blog lets us know that you can Drive More Traffic to Your Maps API Site - Include KML Files in Your Sitemap.

If you have a site with maps on it, include a KML file somewhere on your site and use Google Sitemaps to expose it to googlebot (so it knows where to find it).

Including KML files in a sitemap.xml file (see http://www.sitemaps.org/protocol.html ) is a great way for you to help us index and drive traffic to your site. After publishing your data in KML, we’ll crawl the KML files that you specify in your sitemap.xml file. We’ll send users your way when they search for content that is found on your mashup site. As an added bonus, once your data is in KML, it will be available for viewing on Google Earth.
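Generating a sitemap.xml that points at your KML files is a few lines of code. A minimal sketch — the URL here is hypothetical, and in practice you’d write the result to `sitemap.xml` at your site root:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def kml_sitemap(urls):
    """Build a sitemap.xml body whose <loc> entries point at KML files."""
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for u in urls:
        url = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        loc = ET.SubElement(url, "{%s}loc" % SITEMAP_NS)
        loc.text = u
    return ET.tostring(urlset, encoding="unicode")

body = kml_sitemap(["http://example.com/stores.kml"])
```

Submit the resulting file through Google Sitemaps (or reference it from robots.txt) and googlebot will pick up the KML along with the rest of your pages.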
