1. Distributed, Efficient, and Customizable Web Crawler.
The crawler was developed in Ruby and consists of four modules: a task manager, a seed generator, a crawler, and a parser. Because large-scale crawling is resource-intensive, the system supports distributed operation: crawler and parser workers can be deployed across any number of servers.
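The four-module pipeline can be sketched as follows. This is a hypothetical, single-process illustration (all names are mine, not the project's real API); in the actual distributed deployment, crawler and parser workers on separate servers would pull tasks from a shared queue rather than an in-memory `Queue`.

```ruby
# Seed generator: produces the initial URLs to crawl (illustrative values).
seed_urls = ["http://example.com/a", "http://example.com/b"]

# Task manager: hands URLs out to crawler workers; results flow back.
task_queue   = Queue.new
result_queue = Queue.new
seed_urls.each { |url| task_queue << url }

# Crawler workers: in the real system these run on many machines and
# fetch pages over HTTP; fetching is simulated here with a string body.
crawlers = 2.times.map do
  Thread.new do
    loop do
      url = begin
        task_queue.pop(true)   # non-blocking pop; raises when queue is drained
      rescue ThreadError
        break
      end
      result_queue << { url: url, body: "<html>#{url}</html>" }
    end
  end
end
crawlers.each(&:join)

# Parser: extracts structured data from each fetched page.
parsed = []
parsed << result_queue.pop until result_queue.empty?
parsed.each { |page| puts page[:url] }
```

The queues decouple the stages, which is what makes the design scale out: swapping the in-memory `Queue` for a network-accessible one lets workers live on any number of servers.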
## Usage of the crawler:
- Collecting data from Google (with user-customized queries)
- Crawling Chinese classified-ads sites similar to Craigslist
- Gathering news and comments from stock-related sites
2. Web development with Ruby on Rails.