
06 November 2017

What are chatbots? And what is the chatbot community?

In the beginning, all bots on IRC (Internet Relay Chat) were popularly referred to as “chat bots”.  IRC was the predecessor of IM (Instant Messaging) for realtime chat, and Facebook Messenger is, in turn, the successor of IM.

After years of IM services fighting bots and automation, in a surprise move Facebook opened Messenger to bots in April 2016, which I call the “Facebook April surprise”.  Immediately, people began referring to Facebook Messenger bots as “chat bots” (note space).  Until then, the term chatbots (no space) had been gradually taking over the space previously known as chatterbots.

Since the Facebook April surprise, grand confusion has reigned, with people talking at cross-purposes about chatbots and expectations being challenged all around.  Basically, Facebook Messenger chatbots have become “chat apps”, with lots of graphical UI elements, such as cards, interspersed with natural language.

Prior to the Facebook April surprise, there had long been a robust chatterbot community, largely gathered around the controversial Loebner Prize.  To this day, the Loebner Prize remains the only significant implementation of the Turing test in popular use.  I happen to believe that the Turing test itself is problematic, if not a red herring; however, the contest’s founder Hugh Loebner deserves a place in history for stimulating the art, especially through the so-called AI winter.

There are further stakeholders in this melee.  In addition to the academic community of artificial intelligence researchers, there is also the natural language processing community.  Some people count NLP as a subset of AI, though a good argument can also be made against that.  My long investigation into NLP has shown me that natural language processing has largely been predicated on the analysis, or deconstruction, of natural languages, for instance in machine translation, leading to natural language understanding.  It is only relatively recently that an emphasis has been placed on the construction of natural language, generally referred to as natural language generation.

Artificial intelligence itself is not a very useful term, as it implies replicating, or copying, human intelligence, which carries its own set of baggage.  As used today, the term is so broad as to be ineffectual.  In short, AI researchers are not necessarily chatterbot, or dialog system, researchers, and neither are NLP researchers.  There are various and sundry loci for high-level discourse on dialog systems in the academic community, often with large corporations hanging around the periphery.

There used to be a very good, informal mailing list for the Loebner Prize crew, but it was suddenly deleted in a fit of passion.  From there, the chatterbot community, scattered across perhaps a dozen separate web forums, gradually came in from the cold to the chatbots.org “AI Zone” forum, largely dedicated to the art of hand-crafting so-called pattern-matching dialog systems, or chatterbots.

Hot on the heels of the Facebook April surprise, an enterprising young man named Matt Schlicht opened the Facebook Bots (chatbot) group, which had close to 18,000 members six months later (and close to 30,000 members today, 18 months later).  I would say that throughout that growth it has provided an informative and dynamic timeline, around which a new community has rallied.  However, that same community has collectively come to realize that Facebook Messenger “chat apps” are not the chatterbots everyone has been dreaming about.

Matt Schlicht had a proverbial tiger by the tail, in the form of his Facebook Bots group.  Due to the public pressure of a scandal of his own making, he initiated the process of electing a moderation team in November 2016.  I know from my own experience how difficult managing online communities can be, having run the once popular travel mailing list, infotec-travel, throughout the dot-com bubble of the 1990s, an online community which ultimately led to much of the online travel infrastructure we enjoy today.

Not only have I been banned from Matt Schlicht’s Facebook Bots group, but I have been banned twice, and I am still banned today.  The first time, I was banned after posting about my chatbot consulting services.  However, due to the gracious intercession of the current Loebner Prize holder, Mitsuku developer Steve Worswick, my group membership was reinstated.  I was then banned a second time after sharing a private offer from Tinman Systems for their high-end artificial intelligence middleware.

After being ejected from the Facebook Bots group for the second time, I started my own Facebook group, Chatbot Developers Bangalore, owing to my particular interest in AI, NLP, and chatbots in India.  (I also happen to be co-organizer of the Bangalore Robotics Meetup.)  I blog about this a year later because one of the new Facebook Bots group admins stirred up controversy by requesting admission to a closed Facebook group of which I'm a member, Australian Chatbot Community.


This blog post was originally submitted to VentureBeat in November 2016, prior to the successful election of a new administration team for the Facebook Bots group.

30 December 2007

AIML <-> OWL ??

Since I posted my original query to the pandorabots-general list in July, I'm beginning to understand the concepts involved a little better, thanks also to replies from this group and others, such as the protege-owl list.

In a comment to my recent blog entry ("I'm dreaming of RSS in => AIML out"), Jean-Claude Morand has mentioned that RSS 1.0 would probably be more conducive to conversion into RDF or AIML than RSS 2.0. He also mentioned that the Dublin Core metadata standard may eventually overtake RSS in primacy....

So, can XSL transforms really be used to translate between RSS and RDF, and between RDF and AIML?? My understanding at this point is that talking about AIML and OWL is a bit like comparing apples and oranges.... Apparently the output from an OWL Reasoner would be in RDF? I have by now discovered the Robitron group and am finding that archive to be a rich resource....
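
To make the question concrete, here is a toy sketch of the simplest leg, RSS straight into AIML.... This is my own improvisation, not an established pipeline: Python's lxml (an assumption on my part; any XSLT 1.0 processor would do) applies a stylesheet that turns RSS 2.0 items into AIML categories, uppercasing titles to serve as patterns.

    from lxml import etree

    # XSLT 1.0 stylesheet mapping RSS 2.0 items to AIML categories;
    # AIML patterns are conventionally uppercase, hence translate().
    XSLT_SRC = b"""<xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:output method="xml" indent="yes"/>
      <xsl:template match="/rss/channel">
        <aiml version="1.0.1">
          <xsl:for-each select="item">
            <category>
              <pattern><xsl:value-of select="translate(title,
                'abcdefghijklmnopqrstuvwxyz',
                'ABCDEFGHIJKLMNOPQRSTUVWXYZ')"/></pattern>
              <template><xsl:value-of select="description"/></template>
            </category>
          </xsl:for-each>
        </aiml>
      </xsl:template>
    </xsl:stylesheet>"""

    transform = etree.XSLT(etree.XML(XSLT_SRC))

    # A made-up two-line feed, standing in for a real RSS document.
    rss = etree.XML(b"""<rss version="2.0"><channel>
      <item><title>Byron Bay</title>
        <description>A beach town in northern NSW.</description></item>
    </channel></rss>""")

    print(str(transform(rss)))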

What does this have to do with Pandorabots? I would like to address a brief question, in particular to Dr. Wallace... what do you see as the impediments to upgrading the Pandorabots service to include an OWL Reasoner (or in chaining it to another service that would provide the same function)? Surely you've considered this.... Where are the bottlenecks (other than time and money of course)? Is it an unreasonable expectation to be able to upload OWL ontologies much the same as we can upload AIML knowledgebases today?
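
To illustrate the chaining idea, here is a hedged sketch using the Python rdflib library, not anything Pandorabots actually offers, of the kind of side service I have in mind: load some RDF, answer a SPARQL query, and hand the answer back to the bot layer. The tiny graph is invented for illustration.

    import rdflib  # third-party: pip install rdflib

    g = rdflib.Graph()
    g.parse(data="""
        @prefix ex: <http://example.org/> .
        ex:Australia ex:capital ex:Canberra .
    """, format="turtle")

    # A bot front end could route a "what is the capital of X" pattern
    # here and splice the result into its reply template.
    query = """
        SELECT ?capital WHERE {
            <http://example.org/Australia> <http://example.org/capital> ?capital .
        }"""
    for row in g.query(query):
        print("The capital is %s" % row.capital)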

As I have mentioned previously, one of my interests is creating knowledgebases on the fly using taxonomies. My belief is that quick and dirty knowledgebases are a more productive focus than pouring time and energy into trying to meet the requirements of the Turing test (another rant for another day....) Certainly with chatbots there is a substantial element of smoke and mirrors involved in any case.... One can always go back and refine as needed based on actual chat logs.

The next step for me will be to try and convert my most recent book, VAGABOND GLOBETROTTING 3, into a VagaBot.... I would like to hear from anyone with experience in converting books into AIML knowledgebases! My supposition is that a *good* book index is in effect a "taxonomy" of that book.... My guess is that I can use the index entries as patterns, and their referring sections as templates... to create at least the core of a knowledgebase. If more detail is needed then a concordance can always be applied to the book.
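
As a rough sketch of that supposition, with an entirely hypothetical two-entry index standing in for the real book, the conversion might look something like this in Python:

    from xml.sax.saxutils import escape

    # Hypothetical index: entry -> text of the section it refers to.
    index = {
        "visas": "Chapter 2 covers visas: apply early and carry photocopies.",
        "hostels": "Chapter 4 covers hostels: book the first night in advance.",
    }

    # Each index entry becomes a <pattern>, its section a <template>.
    categories = []
    for entry, section in index.items():
        categories.append(
            "<category><pattern>%s</pattern><template>%s</template></category>"
            % (escape(entry.upper()), escape(section)))

    print('<aiml version="1.0.1">\n%s\n</aiml>' % "\n".join(categories))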

After that I hope to tackle creating quick and dirty AIML knowledgebases on the fly from RSS feed title and description fields... not in pursuit of the chimera of the Turing test, but simply to build a better bot. (Now, I wonder if anyone has ever created RSS from a book?!? ;^))

22 December 2007

I'm dreaming of RSS in => AIML out

I am still trying to get my head around the relationship between chatbots and the Semantic Web, or Web 3.0.... Any thoughts or comments on the precise nature of this relationship are welcome.

Converting from VKB back into AIML was my first crash course in working with XML dialects.... Since then the old lightbulb has gone off, or rather "on" I should say, and it suddenly dawned on me that the whole hullabaloo about Web 2.0 largely centers on the exchange of metadata, most often in the form of RSS, another XML dialect.

I was really stoked to learn of the work of Eric Freese, who is apparently processing logic using the Jena framework and then manually(?) converting the resulting RDF into AIML; however, I continue to wait for word of his "Semetag/AIMEE" example at http://www.semetag.com .

My understanding is that it is quite do-able, as in off the shelf, to pull RSS into a database and accumulate it there.... Could such a database of RSS not be used as a potential knowledgebase for a chatbot?
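
As a sketch of that accumulation step, assuming the third-party feedparser package and a placeholder feed URL, something like this would do:

    import sqlite3
    import feedparser  # third-party: pip install feedparser

    db = sqlite3.connect("feeds.db")
    db.execute("""CREATE TABLE IF NOT EXISTS items
                  (link TEXT PRIMARY KEY, title TEXT, description TEXT)""")

    # Fetch a feed and accumulate its items; re-running simply adds
    # whatever is new, since duplicate links are ignored.
    feed = feedparser.parse("https://example.com/feed.rss")  # placeholder URL
    for entry in feed.entries:
        db.execute("INSERT OR IGNORE INTO items VALUES (?, ?, ?)",
                   (entry.get("link"), entry.get("title"),
                    entry.get("description")))
    db.commit()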

The missing element seems to be the processing, or DL Reasoner(?).... I have been unable to find any reference to such a web-based, modular DL Reasoner yet....

http://www.knoodl.com seems to be the closest thing to a "Web 2.0-style" collaborative ontology editor, which is fine for creating ontologies collectively; however, it falls short of meeting the processing requirement.

In short, I'm dreaming of RSS in => AIML out. At this point I would be happy with a "toy" or abbreviated system just to begin playing around with all this affordably (not least time-wise). So it seems what's still needed is a simple, plug and play "Web 2.0-style" (or is that "Web 3.0" style?) web-based DL Reasoner that accepts common OWL ontologies, then automagically goes from RDF into AIML....
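
While I wait for that reasoner, the crude ends of the pipeline are already within reach: continuing the SQLite sketch above, and skipping the reasoning step entirely, the accumulated items can be dumped straight out as AIML....

    import sqlite3
    from xml.sax.saxutils import escape

    # Reads the items table accumulated in the sketch above: titles
    # become <pattern>s, descriptions become <template>s.
    db = sqlite3.connect("feeds.db")
    categories = []
    for title, description in db.execute("SELECT title, description FROM items"):
        categories.append(
            "<category><pattern>%s</pattern><template>%s</template></category>"
            % (escape((title or "").upper()), escape(description or "")))

    print('<aiml version="1.0.1">\n%s\n</aiml>' % "\n".join(categories))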

31 December 2006

ByronBot - Byron Bay, Australia

Lately I've been working with chatterbot technology. A chatterbot is a computer program designed to simulate an intelligent conversation with one or more human users via auditory or textual methods; mine are built using AIML, or Artificial Intelligence Markup Language, an XML dialect for creating natural language software agents.

My latest creation is ByronBot (http://www.mendicott.com/byronbay/), which can answer questions about Byron Bay and the Rainbow Region of northern New South Wales, Australia, where I now live.

Previously, I created the meta-guide geobot (http://www.meta-guide.com/), which knows the continents, regions, all countries, and all of their capitals, and can provide more, such as maps, travel books, and cool travel videos, as well as country-specific information about ecotourism and sustainable tourism.

One associate has compared ByronBot with Microsoft's Ms. Dewey (http://www.msdewey.com/), an Adobe Flash-based experimental interface for Windows Live Search.