• 0 Posts
  • 13 Comments
Joined 3 years ago
Cake day: July 7th, 2023

  • Anything can be backdoored, but I’m really struggling to see how you could do something useful with a DRAM chip. In theory, if it were smart enough, it could analyze the data being stored and manipulate it in some way, but there’s no way a DRAM module would have the processing power and brains to do anything useful with this.

    And memory manipulation would be about the most it could accomplish, because the DRAM modules themselves don’t have signal lines that can control anything. They basically have data lines, address lines, power, ground, and control circuitry. They can’t affect the rest of the motherboard/computer other than by subverting data… and computers tend to be pretty good at catching memory that doesn’t store data properly.

    If you tried hard enough, you could figure out a scenario where this could work, but I don’t think this is something we really need to worry about.



  • I was thinking the same thing, but if the goal is to get from point A to point B, then the real question is what gets you there the safest.

    For example, if you wanted to know the safest way to get from Los Angeles to San Francisco, or the relative danger of each travel method, this would be the right way to frame the data. The fact that it takes longer to travel by car than by plane doesn’t factor into the safety of the travel. You still go the same distance.
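    The per-distance framing above can be sketched in a few lines of Python. All figures below are hypothetical placeholders, not real safety statistics; the point is only that normalizing fatalities by passenger-miles (rather than by trip or by hour) is what makes a car-vs-plane comparison meaningful for a fixed route.

    ```python
    # Sketch: comparing travel safety per unit distance.
    # The numbers are made up for illustration only.

    def fatalities_per_billion_miles(fatalities: float, passenger_miles: float) -> float:
        """Normalize a fatality count by total distance traveled."""
        return fatalities / passenger_miles * 1e9

    # Hypothetical totals for an LA -> SF style comparison
    car = fatalities_per_billion_miles(fatalities=500, passenger_miles=100e9)
    plane = fatalities_per_billion_miles(fatalities=5, passenger_miles=50e9)

    print(f"car:   {car:.2f} fatalities per billion passenger-miles")
    print(f"plane: {plane:.2f} fatalities per billion passenger-miles")
    ```

    Since both travelers cover the same distance, the per-mile rate is the number that answers “which way is safer for this trip,” regardless of how long each mode takes.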




  • “digging thru trash and bunch of obscure websites for info, using critical thinking to filter and refine your results”

    You’re highlighting a barrier to learning that in and of itself has no value. It’s like arguing that kids today should learn cursive because you had to and it exercises the brain! Don’t fool yourself into thinking that just because you did something one way, it’s the best way. The goal is to learn and find solutions to problems. Whatever tool allows you to get there the easiest is the best one.

    Learning through textbooks and one-way absorption of information is not an efficient way to learn. Having the ability to ask questions and challenge a teacher (in this case the AI) is a far superior way to learn, IMHO.


  • The thing is… AI is making me smarter! I use AI as a learning tool. The absolute best thing about AI is the ability to follow up with additional questions and get a better understanding of a subject. I use it to ask about technical topics and flesh out a better understanding than I ever got from just a textbook. I have seen some instances of hallucination in the past, but with the current generation of AI I’ve had very good results, and I consider it an excellent tool for learning.

    For reference I’m an engineer with over 25 years of experience and I am considered an expert in my field.


  • I think this all has to do with how you are going to compare and pick a winner in intelligence. The traditional way is usually with questions, which LLMs tend to do quite well at. They have a tendency to hallucinate, but in my experience the amount they hallucinate is less than the amount they don’t know.

    The issue is really all about how you measure intelligence. Is it a word problem? A knowledge problem? A logic problem? And then the issue is, can the average person get your question correct? A big part of my statement here is that the average person is not very capable of answering those types of questions.

    In this day and age of alternate facts, vaccine denial, science denial, and other ways that your average person may try to be intentionally stupid… I put my money on an LLM winning the intelligence competition versus the average person. In most cases I think the LLM would beat me in 90% of the topics.

    So, the question to you is: how do you create this competition? What are the questions you’re going to ask that the average person is going to get right and the LLM will get wrong?


  • blady_blah@lemmy.world to Technology@lemmy.world · *Permanently Deleted* · 11 months ago

    I asked Gemini and ChatGPT (the free one) and they both got it right. How many people do you think would get that right if you didn’t write it down in front of them? If Copilot gets it wrong, as per eletes’ post, then the AI success rate is 66%. Ask your average person walking down the street, and I don’t think you would do any better. Plus, there are a million questions that the LLMs would vastly outperform your average human on.