Despite the rush to integrate powerful new models, only about 5% of AI pilot programs achieve rapid revenue acceleration; the vast majority stall, delivering little to no measurable impact on P&L.

The research—based on 150 interviews with leaders, a survey of 350 employees, and an analysis of 300 public AI deployments—paints a clear divide between success stories and stalled projects.

        • leisesprecher@feddit.org · 2 days ago

          No, it does not.

          A deterministic, narrow algorithm that solves exactly one problem is not an AI. Otherwise the Pythagorean theorem would count as AI, as would any other mathematical formula.

          Intelligence, even in terms of AI, means being able to solve new problems. An autopilot can’t do anything other than pilot a specific aircraft, and that’s a good thing.

          • wheezy@lemmy.ml · 2 days ago

            Not sure why you’re getting downvoted. Well, I guess I do. AI marketing has ruined the meaning of the word to the extent that an if statement is “AI”.

            • fmstrat@lemmy.nowsci.com · 1 day ago

              Because they are wrong. Airplane Autopilot is not “one model”, it’s a complex set of systems that take actions based on a trained model. The training of that model used standard ML practices. Sure, it’s a base algorithm, but it follows the same principles. That’s textbook AI.

              No one would have debated this pre-LLM. That being said, if I was in the industry, I’d be calling it an algorithm instead of AI, because those out of the know, well, won’t get it.

          • MrLLM@ani.social · edited 2 days ago

            Intelligence, even in terms of AI, means being able to solve new problems.

            I’d argue that an artificial intelligence is a (usually computational) system that can mimic a specific behavior we consider intelligent, deterministic or not: playing chess, writing text, piloting an aircraft, etc.

            • leisesprecher@feddit.org · 2 days ago

              And you’d argue wrong here, that is simply not the definition of intelligence.

              Extend your logic a bit. Playing an instrument requires intelligence. Is a drum machine intelligent? A mechanical music box?

              Yes, the definition of intelligence is vague, but that doesn’t mean you can extend it indefinitely.

              • MrLLM@ani.social · 2 days ago

                I wanna point out three things:

                1. How can you tell someone is wrong when you have no idea?
                2. I think you missed the point, I said artificial intelligence, not intelligence as a whole.
                3. Yes, playing an instrument in a way that makes sense requires a certain degree of intelligence. The music box inherently is not intelligent, but intelligence was required to build it.
                • leisesprecher@feddit.org · 2 days ago

                  1. That’s a weak argument without substance. “No, you!” is not exactly a good counter.

                  2. Yes, that’s exactly what I’m talking about, which refutes your argument in 1).

                  3. That’s a whole different discussion. That intelligence is required to build something has nothing to do with whether the product itself is intelligent. The fact that you manage to mangle that up so badly is almost worrying.

              • Pennomi@lemmy.world · 2 days ago

                I don’t know where you’re getting your definitions but you are wrong.

                Artificial intelligence (AI) is the capability of computational systems to perform tasks typically associated with human intelligence, such as learning, reasoning, problem-solving, perception, and decision-making.

                For example, the humble A* Pathfinding Algorithm falls under the domain of AI, despite it being a relatively simple and common process. Even fixing small problems is still considered problem solving.
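                For readers who haven't met it, A* is a best-first graph search guided by a heuristic estimate of the remaining distance; the sketch below is a minimal, illustrative implementation on a hypothetical 4-connected grid (the grid, function name, and Manhattan heuristic are assumptions for the example, not anything from the thread).

```python
import heapq

def astar(grid, start, goal):
    """A* search on a 2D grid of 0 (free) and 1 (wall).

    Returns the shortest path as a list of (row, col) cells,
    or None if the goal is unreachable.
    """
    def h(cell):
        # Manhattan distance: admissible on a 4-connected grid
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    rows, cols = len(grid), len(grid[0])
    open_heap = [(h(start), 0, start)]   # entries are (f = g + h, g, cell)
    came_from = {}
    best_g = {start: 0}

    while open_heap:
        f, g, cell = heapq.heappop(open_heap)
        if cell == goal:
            # Reconstruct the path by walking parent links back to start
            path = [cell]
            while cell in came_from:
                cell = came_from[cell]
                path.append(cell)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = cell[0] + dr, cell[1] + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    came_from[(nr, nc)] = cell
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None
```

                The heuristic is what makes it "AI" in the classic textbook sense: the search uses an estimate of progress toward the goal to decide what to try next, rather than exploring blindly.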

                • leisesprecher@feddit.org · 2 days ago

                  I’m sorry, but that’s the worst possible conclusion you can get from that paragraph.

                  Again, think your argument to the end. What would not fall under AI in your world? If A* counts, then literally everything with a simple ‘if’ statement would also count. That’s delusional.

                  Do actually read the article and the articles linked. Are you really, really implying that a simple math equation, one that could be solved by a handful of transistors and capacitors if need be, is doing something “typically associated with human intelligence”? Really?

                    • MrLLM@ani.social · 2 days ago

                      Like, are you seriously saying that everyone on Wikipedia is wrong but you? You’re the only one delusional here.

                      Believe it or not, a bunch of if statements can mimic intelligent behavior. Again, it’s not that it is intelligent; it looks like it, which is the whole point (that you obviously missed).
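                      As a toy illustration of that point, a responder built from nothing but if statements can still look superficially conversational (this example and its canned replies are hypothetical, just to make the claim concrete):

```python
def respond(message):
    """A toy rule-based 'chatbot': pure if statements, no learning,
    yet it superficially mimics conversational behavior."""
    text = message.lower()
    if "hello" in text or "hi " in text:
        return "Hello! How can I help?"
    if "weather" in text:
        return "I can't check live weather, but I hope it's nice out."
    if text.endswith("?"):
        return "Good question. What do you think?"
    return "Tell me more."
```

                      It obviously isn't intelligent, but a user skimming its replies might not notice; that gap between mimicking behavior and possessing intelligence is exactly what's being argued about here.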

            • leisesprecher@feddit.org · 2 days ago

              To a certain extent, yes.

              ChatGPT was never explicitly trained to produce code or translate text, but it can do both. Not especially well, but it manages reasonable output most of the time.