Topic modelling and classification are real problems in LLM observability and evaluation, glad to see a platform tackling this.
I see that you have chained prompts. Does that mean I can define agents and functions inside the platform without having them in the code?
Yes! Our pipeline builder is pretty versatile. You can define conditional routing, parallel branches, and cycles. Right now we support LLM nodes and utility nodes (e.g. a JSON extractor). If you can define your logic purely from those nodes (and in the majority of cases you will be able to), then great, you can host everything on Laminar! This guide (https://docs.lmnr.ai/tutorials/control-flow-with-LLM) is a bit outdated but gives you a good idea of how to create and run pipelines.
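Once a pipeline is defined in the UI, you'd call it from your app over the API. Here's a rough sketch of what that could look like; the endpoint path, payload fields, env var name, and the "support-router" pipeline name are illustrative assumptions on my part, not the documented API, so check the guide for the exact interface:

```python
# Hypothetical sketch: invoking a hosted Laminar pipeline over HTTP.
# Endpoint, payload shape, and names below are assumptions for illustration.
import os
import requests

API_KEY = os.environ["LMNR_PROJECT_API_KEY"]  # assumed env var name

resp = requests.post(
    "https://api.lmnr.ai/v1/pipeline/run",      # assumed endpoint
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "pipeline": "support-router",            # pipeline built in the UI
        "inputs": {"user_message": "My invoice is wrong, who do I contact?"},
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # outputs of the pipeline's terminal node(s)
```

The point is that the routing/branching logic lives in the hosted pipeline, so your code only sends inputs and reads outputs.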