Using Generative AI in Upmost
Dec 4, 2023
In our November 2023 newsletter we mentioned our initial thoughts on how Generative AI could be used in Upmost to help list creation for users. As a reminder, Upmost is an innovative social platform designed to deliver brilliant list-based recommendations, with a particular focus on music, entertainment and fashion.
Jake, one of our developers, has been working on this over the last month, and we share his thoughts on his research here.
Upmost: The recommendation social network
The problem
One area we have always worked hard on in the product is reducing the friction when users create new list content in the app. We have been exploring using recent advances in Large Language Model tools to help solve this problem.
Initially we are tackling a smaller use case of generating description text for lists.
The plan
We have experimented with utilising tools like ChatGPT and AWS Bedrock to act as an 'AI assistant' that generates an initial draft of the list description based on the list title and category. The user can then use this as a starting point and make any edits they wish.
By integrating these AI tools, we aim to streamline content creation and save time for our users, while improving the consistency and quality of descriptions across the platform.
The solution
Here we compare ChatGPT (model GPT-4-0613) and AWS Bedrock (model Cohere Command), each given a prompt containing a list title and list category and asked to generate a description for the list.
Prompt used:
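The exact prompt from the experiment isn't reproduced here, but it was along these lines (an illustrative reconstruction; the wording, title and category below are placeholders, not the original):

```
Write a short, friendly description for a list on a social recommendation app.
List title: "Best 90s Hip-Hop Albums"
List category: Music
Keep it to one or two sentences.
```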
Comparison
Bedrock developer notes
The AWS API reference was a bit unclear as it doesn’t have explicit working examples, nor does it clearly explain the differences between each model and the various parameters.
I spent a little while trying to work out why the request body was being refused; it turned out the body content wasn't stringified.
Input was shown as needing to be:
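Roughly this shape (reconstructed from memory rather than the original screenshot; the model ID and parameter names are assumptions to check against the Bedrock docs). Note that `body` must itself be a JSON string:

```json
{
  "modelId": "cohere.command-text-v14",
  "contentType": "application/json",
  "accept": "application/json",
  "body": "{\"prompt\":\"...\",\"max_tokens\":200,\"temperature\":0.8}"
}
```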
When I initially created the body and sent it as a blob buffer, it was rejected. In the end I had to find a GitHub repository with a simple example for each Bedrock model to get it to work.
Bedrock Cohere allows you to change the "temperature" of the output response, which controls the randomness of each response: the higher the value, the more variation there will be between calls.
Example call:
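The call looked roughly like the sketch below. This is hedged: it is written from memory against the `@aws-sdk/client-bedrock-runtime` package, and the helper `buildBedrockParams`, the model ID and the prompt wording are illustrative rather than taken from the original code.

```javascript
// Builds the InvokeModel params for a list-description prompt.
// `temperature` controls randomness: higher values give more variation per call.
function buildBedrockParams(title, category, temperature = 0.7) {
  return {
    modelId: "cohere.command-text-v14", // assumed Cohere Command model ID
    contentType: "application/json",
    accept: "application/json",
    // The body must be a JSON string, not a plain object — see the note above.
    body: JSON.stringify({
      prompt: `Write a short description for a list titled "${title}" in the ${category} category.`,
      max_tokens: 200,
      temperature,
    }),
  };
}

// Sending it would look roughly like this (needs AWS credentials, so not run here):
//   import { BedrockRuntimeClient, InvokeModelCommand } from "@aws-sdk/client-bedrock-runtime";
//   const client = new BedrockRuntimeClient({ region: "eu-west-1" });
//   const response = await client.send(new InvokeModelCommand(buildBedrockParams("Best 90s Hip-Hop Albums", "Music")));

const params = buildBedrockParams("Best 90s Hip-Hop Albums", "Music", 0.9);
console.log(typeof params.body); // "string"
```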
Bedrock is set up to return the output as a stream, and it will only ever send back a buffer, never a string. The streaming may be something of an illusion, though, as it simply returns the response piece by piece, likely as it is generated.
Having to handle a returned buffer isn't a large issue, but it is a slightly tedious thing to deal with when it shouldn't be needed.
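The decode boilerplate amounts to a few lines like the sketch below. Two assumptions here: the non-streaming InvokeModel response body arrives as a `Uint8Array`, and the `generations[0].text` shape is how I recall Cohere Command responses — both worth checking against the current docs. A fake buffer stands in for the real response so the snippet is self-contained:

```javascript
// Simulate a Bedrock response body: the SDK hands back bytes, not a string.
const fakeBody = new TextEncoder().encode(
  JSON.stringify({ generations: [{ text: "A nostalgic trip through the golden era of hip-hop." }] })
);

// The decode step every Bedrock call needs: bytes -> string -> JSON.
const decoded = JSON.parse(new TextDecoder().decode(fakeBody));
console.log(decoded.generations[0].text);
```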
There also doesn’t appear to be a way to continue a chat; it seems to use only the prompt that is sent. If there is a method for doing this, it isn’t shown anywhere in the documentation.
Example responses:
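The original example responses were screenshots; the stand-in below is an illustrative reconstruction (not a real model output) showing the response shape and the prompt-referencing, question-asking habit described below:

```json
{
  "generations": [
    {
      "text": "Sure! Here is a description for your list \"Best 90s Hip-Hop Albums\": A journey through the golden era of rap. Would you like me to suggest some albums to add?"
    }
  ]
}
```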
A common problem I came across is that the generated results often either referenced the prompt or asked a further question. Whilst this could potentially be improved with better prompt engineering, it seems an unnecessary habit for a service that doesn't support continued chat well.
ChatGPT developer notes
ChatGPT was much quicker and easier to get an initial setup going: it requires less code, and the documentation is far more readable and extensive. I had no issue importing the package, sending a string prompt and getting a response.
Example call:
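The call looked roughly like this sketch against the `openai` Node package. The model name matches the comparison above, but the helper `buildChatRequest` and the message wording are illustrative, not from the original code:

```javascript
// Builds the chat completion request. Roles let you add context beyond the
// user's message — here a system message frames the assistant's job.
function buildChatRequest(title, category) {
  return {
    model: "gpt-4-0613",
    messages: [
      { role: "system", content: "You write short, friendly descriptions for user-created lists." },
      { role: "user", content: `Write a description for a list titled "${title}" in the ${category} category.` },
    ],
  };
}

// Sending it (needs an API key, so not run here):
//   import OpenAI from "openai";
//   const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
//   const completion = await openai.chat.completions.create(buildChatRequest("Best 90s Hip-Hop Albums", "Music"));
//   completion.choices[0].message.content is plain text — no buffers to decode.

const req = buildChatRequest("Best 90s Hip-Hop Albums", "Music");
console.log(req.messages.length); // 2
```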
I found it interesting that you can send messages with different roles, which add context to the prompt beyond the message sent by the user, allowing different tools and assistants to be added into the context as well. Whilst not useful for a single prompt/response, this provides far more power for conversational GPTs.
Rate limits really restrict scaling for the GPT-4 models.
The text quality of responses is superior to Bedrock Cohere's.
It is much nicer having a JSON/text response instead of having to deal with buffers/blobs.
Example responses:
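Again, the original responses were screenshots; the stand-in below is an illustrative reconstruction (not a real model output) of the response shape and the slightly overwrought tone noted below:

```json
{
  "choices": [
    {
      "message": {
        "role": "assistant",
        "content": "Step into a world of lyrical mastery and timeless beats with this definitive collection of the finest hip-hop albums the 1990s had to offer."
      },
      "finish_reason": "stop"
    }
  ]
}
```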
A quick note: the responses often seem a bit over the top, almost "trying" too hard to seem superior or intelligent. For the context of creating descriptions, they just don't read as particularly human.
Conclusions
ChatGPT's API is much easier to work with and a bit more powerful, with the trade-off of speed.
Both models still feel a bit robotic, but that can potentially be improved with prompt engineering.
Bedrock (Cohere) seemed to act more as a chat than as a single-prompt service, which causes a bit of a headache when you just want to retrieve a single response and nothing more.
GPT-4's rate limits mean it's currently not really scalable for a high-usage application. This will almost certainly improve over time.
In their current states: if you want speed and price, go for Bedrock; if you want quality of response, go for GPT-4.
We plan to expand the use of AI tools in Upmost in the future, where it makes sense to do so. We see this mainly revolving around improving the efficiency of creating lists, but also image generation and content moderation (which we are already doing in part with AWS Rekognition).