[Long-time reader delurking.]
The language was something of a mess, with features borrowed from all over the place. If I recall correctly, it most closely resembled Visual Basic, though I may be misremembering. It was usable, however.
What killed it? My guess is that it fell prey to the difficulty of scraping content in general, which has led me to two observations:
* Almost all Web "active proxy" systems have failed. Transcoders, scrapers, Web scripting languages, etc., regardless of their specific capabilities, have been almost completely unsuccessful. A system rich enough to scrape content from an uncooperative web site tends to be expensive and of narrow interest.
* If you could program well enough to scrape content from a complicated site, you'd probably prefer a real language with the associated debugging and support tools. Or Perl.
A way out of this, I think, would be for a vendor to sell relatively inexpensive tools that generate "real" language output, either for direct use, or for use as a "scaffold" when writing the full application. Going straight from tool to application doesn't seem to work well.
The other alternative would be to sell a really good library (Java or .NET) that encapsulates the nasty bits of scraping content or automating the Web, while allowing the developer to stay in her or his favorite environment and language.
[SiteRaider is a good example of the "cheap tool" approach. I haven't found the ultimate wrapper/scraper library for Java.]
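To make the "library" suggestion concrete, here is a minimal sketch, in plain Java using only the standard library, of the kind of surface such a wrapper might expose. The class and method names (PageScraper, fetch, extract) are hypothetical, not any existing product's API; a real library would hide far more of the nasty bits (cookies, sessions, form posts, broken HTML) behind the same sort of interface.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    // Hypothetical sketch: fetch a page and pull out the pieces you care
    // about, without the developer ever leaving Java.
    public class PageScraper {

        // Fetch the raw HTML of a page, following redirects.
        public static String fetch(String pageUrl) throws Exception {
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(pageUrl).openConnection();
            conn.setInstanceFollowRedirects(true);
            conn.setRequestProperty("User-Agent", "PageScraper/0.1");
            StringBuilder html = new StringBuilder();
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                String line;
                while ((line = in.readLine()) != null) {
                    html.append(line).append('\n');
                }
            }
            return html.toString();
        }

        // Return the first capture group of the pattern, or null if absent.
        public static String extract(String html, String regex) {
            Matcher m = Pattern.compile(regex, Pattern.DOTALL).matcher(html);
            return m.find() ? m.group(1) : null;
        }

        public static void main(String[] args) throws Exception {
            // Example: grab the <title> of a page.
            String html = fetch("http://example.com/");
            System.out.println(extract(html, "<title>(.*?)</title>"));
        }
    }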