this post was submitted on 22 Feb 2024
487 points (96.2% liked)

Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis

Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

[–] [email protected] 2 points 7 months ago (1 children)

How powerful is ollama compared to, say, GPT-4?

I’ve heard GPT-4 uses an enormous amount of energy to answer each prompt. Are the models runnable on personal equipment once they’re trained?

I’d love to have an uncensored AI

[–] [email protected] 1 points 6 months ago

Llama 2 is pretty good, but there are a ton of different models with different pros and cons; you can see some of them here: https://ollama.com/library. However, I would say that as a whole these models are generally slightly less polished than ChatGPT.

To put it another way: when things are good they’re just as good, but when things are bad the AI will start going off the rails, for instance holding both sides of the conversation, refusing to answer, or just saying goodbye. It’s more “wild west,” but you can also save your chats and go back to them, so there are ways to mitigate it, and things are only getting better.
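And yes, these run on your own hardware once they’re trained. ollama serves a local HTTP API (on port 11434 by default), so after pulling a model with `ollama pull llama2` you can talk to it from a few lines of Python. This is just a rough sketch against the `/api/generate` endpoint; the model name and prompt here are only examples.

```python
# Rough sketch: query a locally running ollama server (default port 11434).
# Assumes ollama is installed and `ollama pull llama2` has already been run.
import json
import urllib.request

def ask(prompt: str, model: str = "llama2") -> str:
    """Send one prompt to the local ollama REST API and return the full reply."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one complete JSON response instead of a stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("Why is the sky blue?"))
```

Nothing leaves your machine, so how big a model you can run mostly comes down to your RAM/VRAM; the smaller quantized models on the library page run fine on a lot of consumer hardware.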