When You Build a Robot Smarter than People

Comic Strips · comicstrips · 24 Posts · 19 Posters
#1 · [email protected]

This post did not contain any content.
    • B [email protected]
      This post did not contain any content.
      Link Preview Image
      J This user is from outside of this forum
      J This user is from outside of this forum
      [email protected]
      wrote last edited by
      #2

      The risk is worth it for bideo games

      1 Reply Last reply
      17
      • B [email protected]
        This post did not contain any content.
        Link Preview Image
        L This user is from outside of this forum
        L This user is from outside of this forum
        [email protected]
        wrote last edited by
        #3

        To me, there is no risk. Destroying the world is the goal. Humans had a very bad run on this planet. Destroy. Erase. Rebuild.

        K S 2 Replies Last reply
        7
        • B [email protected]
          This post did not contain any content.
          Link Preview Image
          paraphrand@lemmy.worldP This user is from outside of this forum
          paraphrand@lemmy.worldP This user is from outside of this forum
          [email protected]
          wrote last edited by
          #4

          SamanthA

          1 Reply Last reply
          1
          • L [email protected]

            To me, there is no risk. Destroying the world is the goal. Humans had a very bad run on this planet. Destroy. Erase. Rebuild.

            K This user is from outside of this forum
            K This user is from outside of this forum
            [email protected]
            wrote last edited by
            #5

            Cringe take

            1 Reply Last reply
            20
            • L [email protected]

              To me, there is no risk. Destroying the world is the goal. Humans had a very bad run on this planet. Destroy. Erase. Rebuild.

              S This user is from outside of this forum
              S This user is from outside of this forum
              [email protected]
              wrote last edited by
              #6

              Problem is, there probably isn't any rebuilding again or at least not to the level of technology that we are currently at. The main reasoning for that is all of the easy to get to resources like metals and fossil fuels have already been used up. So if this doesn't work out and another potentially intelligent species comes along it's going to be even harder than we had it getting things started.

              L 1 Reply Last reply
              7
              • B [email protected]
                This post did not contain any content.
                Link Preview Image
                D This user is from outside of this forum
                D This user is from outside of this forum
                [email protected]
                wrote last edited by
                #7

                It is funny watching people claim AGI is just around the corner so we need to be safe with LLMs

                ...when LLM can't keep track of what's being talked about, and their main risks are: Covering the internet with slop and propaganda, and contributing to claime change. Both of which are more about how we use LLMs.

                T sxan@midwest.socialS T S 4 Replies Last reply
                31
                • S [email protected]

                  Problem is, there probably isn't any rebuilding again or at least not to the level of technology that we are currently at. The main reasoning for that is all of the easy to get to resources like metals and fossil fuels have already been used up. So if this doesn't work out and another potentially intelligent species comes along it's going to be even harder than we had it getting things started.

                  L This user is from outside of this forum
                  L This user is from outside of this forum
                  [email protected]
                  wrote last edited by
                  #8

                  .........I feel like you totally missed the point. Destroy all humans. Humans are now extinct. Erase their effects on the planet. And rebuild the ecosystem with species that are healthy for the planet.

                  Humans make the mistake of thinking that they are the most important thing in existence, and the world would end without them. This planet has survived countless exinctions of species in the past. It'll survive just as well without us.

                  carbonicedragon@pawb.socialC B samus12345@sh.itjust.worksS 3 Replies Last reply
                  2
                  • L [email protected]

                    .........I feel like you totally missed the point. Destroy all humans. Humans are now extinct. Erase their effects on the planet. And rebuild the ecosystem with species that are healthy for the planet.

                    Humans make the mistake of thinking that they are the most important thing in existence, and the world would end without them. This planet has survived countless exinctions of species in the past. It'll survive just as well without us.

                    carbonicedragon@pawb.socialC This user is from outside of this forum
                    carbonicedragon@pawb.socialC This user is from outside of this forum
                    [email protected]
                    wrote last edited by
                    #9

                    The planet would survive, sure, but why should we care that it does without us? Meaning is something that we invented, without someone to assign it, an ecosystem is little more valuable than a rock.

                    I 1 Reply Last reply
                    9
                    • D [email protected]

                      It is funny watching people claim AGI is just around the corner so we need to be safe with LLMs

                      ...when LLM can't keep track of what's being talked about, and their main risks are: Covering the internet with slop and propaganda, and contributing to claime change. Both of which are more about how we use LLMs.

                      T This user is from outside of this forum
                      T This user is from outside of this forum
                      [email protected]
                      wrote last edited by
                      #10

                      Right but reliance on it is a way to destroy the world in the dumbest way. I don’t mean in the robot apocalypse way but the collapse of most societies. Without reliable information, nothing can get done. If shitty llms get put into everything, there's no government, no travel, no grid/infrastructure and logistics of every kind are gone.

                      While it’s fun to think about living in a small, self-sufficient community, we are not prepared for that and certainly not at this pace.

                      1 Reply Last reply
                      8
#11 · [email protected], replying to [email protected]

Animals experience meaning. They don't like to die.
                        • D [email protected]

                          It is funny watching people claim AGI is just around the corner so we need to be safe with LLMs

                          ...when LLM can't keep track of what's being talked about, and their main risks are: Covering the internet with slop and propaganda, and contributing to claime change. Both of which are more about how we use LLMs.

                          sxan@midwest.socialS This user is from outside of this forum
                          sxan@midwest.socialS This user is from outside of this forum
                          [email protected]
                          wrote last edited by
                          #12

                          Maybe that's the risk. That we design it to be benevolent, but it destroys us through sheer stupidity.

                          It's one way to get monkey paw wishes. "AI, solve climate change!" "Ok! Eliminating all humans now!"

                          1 Reply Last reply
                          2
                          • D [email protected]

                            It is funny watching people claim AGI is just around the corner so we need to be safe with LLMs

                            ...when LLM can't keep track of what's being talked about, and their main risks are: Covering the internet with slop and propaganda, and contributing to claime change. Both of which are more about how we use LLMs.

                            T This user is from outside of this forum
                            T This user is from outside of this forum
                            [email protected]
                            wrote last edited by
                            #13

                            The risk of LLMs aren't on what it might do. It is not smart enough to find ways to harm us. The risk seems from what stupid people will let it do.

                            If you put bunch of nuclear buttons in front of a child/monkey/dog whatever, then it can destroy the world. That seems to be what's LLM problem is heading towards. People are using it to do things that it can't, and trusting it because AI has been hyped so much throughout our past.

                            B 1 Reply Last reply
                            4
                            • L [email protected]

                              .........I feel like you totally missed the point. Destroy all humans. Humans are now extinct. Erase their effects on the planet. And rebuild the ecosystem with species that are healthy for the planet.

                              Humans make the mistake of thinking that they are the most important thing in existence, and the world would end without them. This planet has survived countless exinctions of species in the past. It'll survive just as well without us.

                              B This user is from outside of this forum
                              B This user is from outside of this forum
                              [email protected]
                              wrote last edited by
                              #14

                              Erase their effects on the planet. And rebuild the ecosystem with species that are healthy for the planet.

                              Coal ain't coming back. The Carboniferous isn't going to happen again because the mycological consciousness knows how to deal with lignin already.

                              Replacing the oil I know less about, but it would take millions of years to replace what we've burnt/processed if it was produced at historical rates.

                              1 Reply Last reply
                              3
                              • L [email protected]

                                .........I feel like you totally missed the point. Destroy all humans. Humans are now extinct. Erase their effects on the planet. And rebuild the ecosystem with species that are healthy for the planet.

                                Humans make the mistake of thinking that they are the most important thing in existence, and the world would end without them. This planet has survived countless exinctions of species in the past. It'll survive just as well without us.

                                samus12345@sh.itjust.worksS This user is from outside of this forum
                                samus12345@sh.itjust.worksS This user is from outside of this forum
                                [email protected]
                                wrote last edited by
                                #15

                                "I like how you think, monkey. Prepare to be destroyed!"

                                1 Reply Last reply
                                0
                                • B [email protected]
                                  This post did not contain any content.
                                  Link Preview Image
                                  samus12345@sh.itjust.worksS This user is from outside of this forum
                                  samus12345@sh.itjust.worksS This user is from outside of this forum
                                  [email protected]
                                  wrote last edited by
                                  #16

                                  1 Reply Last reply
                                  18
                                  • T [email protected]

                                    The risk of LLMs aren't on what it might do. It is not smart enough to find ways to harm us. The risk seems from what stupid people will let it do.

                                    If you put bunch of nuclear buttons in front of a child/monkey/dog whatever, then it can destroy the world. That seems to be what's LLM problem is heading towards. People are using it to do things that it can't, and trusting it because AI has been hyped so much throughout our past.

                                    B This user is from outside of this forum
                                    B This user is from outside of this forum
                                    [email protected]
                                    wrote last edited by
                                    #17

                                    LLMs are already deleting whole production databases because "stupid" people are convinced they can vibe code everything.

                                    Even programmers I (used to) respect are getting convinced LLM are "essential". 😞

                                    I A 2 Replies Last reply
                                    2
                                    • D [email protected]

                                      It is funny watching people claim AGI is just around the corner so we need to be safe with LLMs

                                      ...when LLM can't keep track of what's being talked about, and their main risks are: Covering the internet with slop and propaganda, and contributing to claime change. Both of which are more about how we use LLMs.

                                      S This user is from outside of this forum
                                      S This user is from outside of this forum
                                      [email protected]
                                      wrote last edited by
                                      #18

                                      The difference between LLMs and human intelligence is stark.
                                      But the difference between LLMs and other forms of computer intelligence is stark too (eg LLMs can’t do fairly basic maths, whereas computers have always been super intelligences in the calculator domain). It’s reasonable to assume that someone will figure out how to make an LLM that can integrate better with the rest of the computer sooner rather than later, and we don’t really know what that’ll look like. And that requires few new capabilities.

                                      The reality is we don’t know how many steps between now and when we get AGI, some people before the big llm hype were insisting quality language processing was the key missing feature, now that looks a little naive, but we still don’t know exactly what’s missing.
                                      So better to plan ahead and maybe arrive early at solutions than wait until AGI has arrived and done something irreversible to start planning for it.

                                      1 Reply Last reply
                                      2
                                      • B [email protected]
                                        This post did not contain any content.
                                        Link Preview Image
                                        P This user is from outside of this forum
                                        P This user is from outside of this forum
                                        [email protected]
                                        wrote last edited by
                                        #19

                                        Do you remember when we were all wanting to be careful with AI, and not just proliferate the thing beyond any control?

                                        It was only a few years ago, but pepperidge farm remembers

                                        1 Reply Last reply
                                        2
                                        • B [email protected]

                                          LLMs are already deleting whole production databases because "stupid" people are convinced they can vibe code everything.

                                          Even programmers I (used to) respect are getting convinced LLM are "essential". 😞

                                          I This user is from outside of this forum
                                          I This user is from outside of this forum
                                          [email protected]
                                          wrote last edited by
                                          #20

                                          They are useful to replace stackoverflow searches.

                                          B 1 Reply Last reply
                                          1
Powered by NodeBB