A few days ago, Gramps mentioned Perplexity’s new online LLM models with real-time information.
We’ve just added Perplexity: PPLX 70B Online to our model lineup, and I’ve tested it a bit. While it’s not as powerful as Claude v2.1 or GPT-4-turbo, it has lots of potential given its affordability (1 coin per 100 words) and its real-time access to information. 💡
Here’s an example of how having real-time access to information can be a game-changer:
https://platform.straico.com/share/chat/656bca452dd0a9e83e8f3759
PS: This is the 70B version; I can add the 7B version too if anyone needs it.
Feel free to explore and let us know your thoughts! 🚀