Looking to hear your thoughts on this, and hopefully you can share points in a more sophisticated manner than I can. (Also, I hope this is an appropriate place to post?)

I have run into this discussion a few times across the fediverse, but I can’t for the life of me find those threads and comments lol

I believe that a non-corporate-owned platform with user-generated information is optimal, like Wikipedia. I don’t know the technicalities, but I feel like AI can’t replace answers drawn from human experience, from humans who are enthusiasts and care about helping each other rather than making money.

I don’t know much about this topic, but I’m curious if you have actual, real answers! Thread-based services like this and Stack Overflow (?) vs. ChatGPT vs. Bing vs. Google, etc.

  • ribboo@lemmy.world · 1 year ago

    I think people are way too quick to dismiss AI on the basis that it’s not always factual. Searching for stuff and adding “Reddit” to the query is a great way to get non-factual information as well. Anyone with real insight into a subject knows how horrible many highly upvoted comments are.

    Whether you use AI, Reddit, or Google, you have to do a quick analysis of how credible the result seems. I use all three of them, but more and more AI for niche searches that are hard to get good results for.

  • Shambling Shapes@lemmy.one · 1 year ago

    I mostly have experience with Bing, and that’s because they keep forcing their shitty AI search splash page on me every time I want to do a normal web search. I turned it off in the Edge browser but, what do you know, it keeps coming back.

    Any new feature a company repeatedly forces on me is going to be starting from a hole it has to dig out of. The bigger the corporation, the more immediately resistant I will be to it. “ChatGPT” and “AI” as the latest buzzphrases grate on me.

    Outside the big corporations, I’m keen to tinker around with it some. I’ve done some machine learning stuff in years past, but this is a large step change in what is available to hobbyists.

  • variouslegumes@reddthat.com · 1 year ago (edited)

    I’ve been trying out an IaC service’s (Pulumi’s) chatbot to answer questions about how to spin up infrastructure. It’s really bad. It totally makes up properties that don’t exist and at times spits out code that doesn’t even make sense syntactically. Not to mention that the code it generates has the potential to cost not-insignificant amounts of money.

    Definitely not a replacement for Stack Overflow, GitHub, forums, or random blog posts, at least not for a service that spins up critical infrastructure. You have to know, to some degree, how that stuff works. And if you know how that stuff works, what’s the point of the chatbot? Saving a few minutes of typing and looking at documentation?
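
    To make the concern concrete, here is a minimal sketch of the kind of Pulumi program being discussed, assuming the Python SDK and the pulumi_aws provider; the resource and names are illustrative, not what the chatbot produced. The point is that every property has to exist in the real provider schema, which is exactly what the generated answers kept getting wrong.

    ```python
    # Minimal Pulumi program (Python SDK + pulumi_aws) - illustrative sketch only.
    import pulumi
    import pulumi_aws as aws

    # A real resource type with a real argument. A chatbot answer might instead
    # invent a property the provider schema does not define; `pulumi preview`
    # rejects that before anything (potentially costly) is actually created.
    bucket = aws.s3.Bucket(
        "example-bucket",      # logical name, hypothetical
        force_destroy=True,    # real argument: allow deleting a non-empty bucket
    )

    pulumi.export("bucket_name", bucket.id)
    ```

    Running `pulumi preview` before `pulumi up` is where made-up properties and surprise resources show up, which is why you still need to understand the provider you are asking the chatbot about.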

  • arthur · 1 year ago

    Without solving the “hallucination problem”, it’s very risky to let this become mainstream. It’s also extremely expensive, since running LLM prompts costs more energy than simple searches. Right now, running smaller models locally seems more interesting.

    Also, it seems more useful not as a chat interface, but as a voice assistant.

    https://www.nytimes.com/2011/09/09/technology/google-details-and-defends-its-use-of-electricity.html#:~:text=Google also released an estimate,be difficult to understand intuitively.

    https://towardsdatascience.com/chatgpts-electricity-consumption-7873483feac4#:~:text=ChatGPT’s electricity consumption per query is the same as BLOOM’s,of BLOOM’s%2C i.e. 0.00297 KWh
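
    For scale, a rough back-of-the-envelope comparison using the figures cited in those two links (about 0.0003 kWh per traditional Google search, and roughly 0.00297 kWh per ChatGPT query); these are estimates, not measurements:

    ```python
    # Back-of-the-envelope energy comparison using the estimates cited above.
    google_search_kwh = 0.0003    # ~0.3 Wh per search (Google's own figure)
    chatgpt_query_kwh = 0.00297   # Towards Data Science estimate (same as BLOOM)

    ratio = chatgpt_query_kwh / google_search_kwh
    print(f"One LLM query ~ {ratio:.0f}x the energy of one web search")  # roughly 10x
    ```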

  • Kir@feddit.it · 1 year ago

    Generating answers is not the same as searching for answers. If I need a search engine, I want a search engine. If I need a text-generation model, I want a text-generation model.

  • Julian@lemm.ee · 1 year ago

    The worst part about AI as a search engine is that it doesn’t (or at least can’t reliably) give you the original source. It can tell you lots of stuff, but there’s no link to the news article or wiki page it got it from. A traditional search engine can give you unreliable results, but at least you can look at them yourself and decide whether they’re reliable. An AI search engine has you take what it says at face value, true or not.

  • Dizzy Devil Ducky@lemm.ee · 1 year ago

    I personally do not like the idea of AI-powered “search” engines, since AI has been known to absolutely make stuff up and cite fake articles that don’t actually exist.

    I don’t remember the exact article, but I do remember the story of either a lawyer or a law professor (I can’t remember which) who asked an AI chatbot about himself, and it came up with a citation to a fake news article about him having sexual relations with a student (if I’m remembering this all correctly).

    Also, I prefer a traditional search where I’m given a ton of varying links to different web pages in a list, so I can open a link and, if I don’t find what I’m looking for, just close it and try another one. Compare that to any time I’ve used the Perplexity chatbot, where at most I’m given a few links at the end of each response that may or may not contain the answer I’m looking for, if they’re even legitimate.

  • I’m with you on this one. Personally, I see a myriad of issues with replacing search engines with AI-generated answers:

    1. Accuracy. Without going into what is truth or falsehood, can you trust AI-generated answers? I use Brave Search occasionally, and it has an AI summary at the top. A lot of the time it strings multiple conflicting answers together into a paragraph, and the result is laughably bad.

    When I look something up that isn’t trivial, I typically use multiple search results and make the call myself. This step is removed if you use AI, unless you explicitly ask it to list all the top conflicting answers (along with sources) so you can decide for yourself. As far as I know, though, its amalgamated answer is treated as a source of truth, even when the content has nuanced conflicts a human could easily spot. This alone deters me from AI search in general.

    2. I feel like doing this will degrade my reading/skimming comprehension and research skills, and can lead to blindly trusting direct, easy-to-access answers.

    3. In the context of technical searches like programming or whatnot, I’m not that pressed for time to take shortcuts. I don’t mind working stuff out from online forums and documentation, purely because I enjoy it and it’s part of the process.

    4. Sometimes, looking things up yourself means you can also discover great blogs and personal wikis from niche communities, and related content that you can save and look back at later.

    5. Centralizing information makes the internet bland, boring, and potentially exploitative. If it becomes normalized to visit one or two big AI search engines instead of actually clicking on human-made sources, then the information-providing part of the internet will be lost to time.

    There are also problems with biases, alignment, training AI on AI-generated content, etc. Make of that what you will, but it sounds worse than spending a couple of minutes selecting sources yourself. Top results are already full of generic, AI-generated stuff. The internet, made by us, for us, must prevail.

    Anecdotally, I’ve used ChatGPT once or twice when I was really pressed for time with something I couldn’t find anywhere, and because my university professor wasn’t replying to my email on the topic. I was somewhat impressed by its performance, but that was after 6 or 7 prompts, not a single search.

    Maybe the next generation of AI-search users, who’ve never looked a thing up manually, will grimace at the thought of pre-AI search engines.

  • Fizz@lemmy.nz · 1 year ago

    I definitely think AI search engines are the next step. The way most people use Google is already a human-readable prompt, which GPT handles very well. We just need to improve the results and figure out a way for it not to steal content from, and suppress traffic to, the websites it draws on.