Components of a full stack bot application



This thread is to discuss the various components that together make up a bot, and to share - if available - good practices about this full stack.

As far as I understand, a full-stack bot integrated with Rocket.Chat basically has three layers:

  • The bot’s hands & ears: It can receive and send messages
  • The bot’s brain: It understands natural language - intents and the tokens that play a role in them - and knows how to ask the user for relevant information
  • The bot’s tools: In order to do more than just talking, a bot may need to interact with the machinery surrounding it. In our case, this is Rocket.Chat. If the bot wants to e.g. query users or send non-message information to the user on the other end, it needs those tools.
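To make the layering concrete, here is a minimal sketch in JavaScript (all names are hypothetical, and the parse-result shape only loosely follows what an NLU engine like Rasa returns): a pure routing function that takes what the brain understood and decides which tool-level action to perform.

```javascript
// Hypothetical sketch: the "brain" hands us a parse result
// (shape loosely modeled on an NLU engine's output) and the
// router decides which "tool" action the bot should perform.
function routeIntent(parse) {
  const intent = parse.intent && parse.intent.name;
  switch (intent) {
    case 'greet':
      return { action: 'reply', text: 'Hello! How can I help?' };
    case 'lookup_user':
      // The "tools" layer (e.g. the Rocket.Chat SDK) would carry this out.
      return { action: 'query_users', entities: parse.entities || [] };
    default:
      // When the intent is unclear, ask the user for clarification.
      return { action: 'reply', text: "Sorry, I didn't get that." };
  }
}

// Example: a parse result as the NLU layer might produce it.
const decision = routeIntent({
  intent: { name: 'lookup_user', confidence: 0.92 },
  entities: [{ entity: 'username', value: 'oliver' }],
});
console.log(decision.action); // → "query_users"
```

Keeping this mapping pure (no I/O) makes it easy to test the brain-to-tools handoff without a running server.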

From what I have understood so far:

  • the Rocket.Chat SDK provides the tools.
  • For the hands & ears, there are frameworks like Hubot, Botkit, or just plain imperative code. Some tools may facilitate the hands’ work.
  • For the brain, there is also a huge variety; some bot frameworks also provide NLU. But the big players imho are Dialogflow, Watson, and Rasa.
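To illustrate the hands-to-brain handoff with a self-hosted NLU server, here is a hedged sketch (the endpoint path and payload depend on the Rasa version; newer versions expose `POST /model/parse` taking `{"text": "..."}` - treat the URL and threshold as assumptions):

```javascript
// Hypothetical sketch: forward the raw message text to a
// self-hosted Rasa server and pick the top intent if it is
// confident enough. Endpoint path is version-dependent.
async function parseWithRasa(text, rasaUrl = 'http://localhost:5005') {
  const res = await fetch(`${rasaUrl}/model/parse`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ text }),
  });
  return res.json();
}

// Pure helper: accept the top intent only above a confidence threshold,
// otherwise signal that the bot should ask a clarification question.
function topIntent(parse, threshold = 0.6) {
  const intent = parse && parse.intent;
  if (intent && intent.confidence >= threshold) return intent.name;
  return null;
}
```

The confidence cutoff is where the brain decides between acting and asking back, which is exactly the "knows how to ask the user for relevant information" part above.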

Does that match your perception?

For us, we’re looking into a stack which - most importantly - does not include any SaaS (at runtime) but can be fully self-hosted. Thus, we’re looking into Botkit + Rasa, packaging this with the SDK into an image and running it in a k8s cluster.
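For local development, the same self-hosted layout could be sketched as a compose file before translating it into k8s Deployments/Services. This is only an illustration - the bot image, env var names, and ports are placeholders, not official artifacts (only `rasa/rasa` is the official Rasa image):

```yaml
# Hypothetical compose sketch of the self-hosted stack.
version: "3"
services:
  rasa:
    image: rasa/rasa          # official Rasa image; the "brain"
    ports:
      - "5005:5005"
  bot:
    build: .                  # your Botkit + Rocket.Chat SDK bundle
    environment:
      ROCKETCHAT_URL: http://rocketchat:3000   # assumed env var name
      RASA_URL: http://rasa:5005               # placeholder
    depends_on:
      - rasa
```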

Have any of you had experience with this stack, or can you even share artifacts?

I’d be most eager to hear from you!

Cheers,
Oliver