Saifudin Badani


Reducing AI Cost by 50%.

I was paying AUD$33 a month for a ChatGPT or Claude subscription, switching between them depending on the use case or whichever had the better model that month. I had two problems with that: first, I was paying too much given that I only use these models for learning, and second, I don't use them daily. So I had to figure out a way to cut the expenditure while keeping access to these models without a monthly subscription.

I came to know that you can get API access to these models (they don't advertise it actively, obviously), but how do I use it? At first I used the API through a PHPStorm extension, but that was designed for code completion and correction. I wanted a GUI like Claude and ChatGPT.
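To give a sense of what "API access" means here: instead of a flat subscription, you pay per request. A minimal sketch, assuming you already have an OpenAI API key exported as OPENAI_API_KEY (the model name is just an example, swap in whatever is current):

```bash
# Call the OpenAI chat completions endpoint directly,
# paying per token instead of a flat monthly fee.
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
        "model": "gpt-4o-mini",
        "messages": [{"role": "user", "content": "Explain vector embeddings in two sentences."}]
      }'
```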

Open WebUI solved this. It has all the features of the mainstream GUIs, plus a lot more. You can spin up a Docker container to host it. You don't necessarily need Ollama, but if you want to use open-source models like Llama and Mistral and have the computing power, you can download them and use them with Open WebUI.
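The command below is roughly what I mean by spinning up a container. It follows the pattern in the Open WebUI docs; the port mapping and volume name are just sensible defaults, so adjust them to your setup:

```bash
# Run Open WebUI in Docker; the UI becomes available on http://localhost:3000.
# The named volume keeps chats and settings across container restarts.
docker run -d \
  -p 3000:8080 \
  -v open-webui:/app/backend/data \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```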

Now, how do you connect and start using ChatGPT and Claude? For ChatGPT, you can follow this guide by DigitalBrainBase on YouTube. Open WebUI supports the OpenAI API natively, so you just have to add your API key and you will find all the models in chat. For Claude, Open WebUI has a feature called Functions. Functions are like plugins; you can refer to the Claude plugin, but use it at your own risk.
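Before wiring the Claude Function into Open WebUI, it's worth checking that your Anthropic API key works on its own. A quick sanity check, assuming the key is exported as ANTHROPIC_API_KEY (again, the model id is just an example):

```bash
# Hit the Anthropic Messages API directly to confirm the key is valid
# before pasting it into the Function's settings in Open WebUI.
curl https://api.anthropic.com/v1/messages \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{
        "model": "claude-3-5-sonnet-latest",
        "max_tokens": 100,
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```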