Despite the rush to integrate powerful new models, only about 5% of AI pilot programs achieve rapid revenue acceleration; the vast majority stall, delivering little to no measurable impact on P&L.

The research, based on 150 interviews with leaders, a survey of 350 employees, and an analysis of 300 public AI deployments, reveals a stark divide between the success stories and the stalled projects.

    • ameancow@lemmy.world · 3 days ago

      I know you’re joking, but for those who don’t know: the headline means “startups”, and they just wanted to avoid the overused term.

      Also, yeah, it actually is far easier to have an AI fly a plane than a car. No obstacles, no sudden changes, no little kids running out from behind a cloud-bank, no traffic except during takeoff and landing, and those systems can also be automated more and more.

      In fact, we don’t need “AI”; we’ve had autopilots that handle almost all aspects of flight for decades now. The F/A-18 Hornet famously has hand-grips by the seat that the pilot is supposed to hold onto during takeoff so they don’t accidentally touch a control.
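      To make the point concrete: a classic autopilot is a deterministic feedback loop, not a learned model. A toy altitude-hold sketch (every gain, unit, and name here is invented for illustration) would look roughly like:

      ```python
      # Toy altitude-hold loop -- an illustrative sketch, not real avionics code.
      class AltitudeHoldPID:
          def __init__(self, kp=0.8, ki=0.05, kd=0.3, dt=0.1):
              self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
              self.integral = 0.0
              self.prev_error = 0.0

          def step(self, target_alt_m, current_alt_m):
              """Map altitude error to an elevator command, same formula every tick."""
              error = target_alt_m - current_alt_m
              self.integral += error * self.dt
              derivative = (error - self.prev_error) / self.dt
              self.prev_error = error
              return self.kp * error + self.ki * self.integral + self.kd * derivative
      ```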

      • Frezik@lemmy.blahaj.zone · 3 days ago

        Conversely, AI running ATC would be a very good thing. To a point.

        It’s been technically feasible for a while to handle 99% of what an ATC does automatically. The problem is that you really want a human to step in for the 1% of situations where things get complicated and genuinely dangerous. Except the human won’t keep their skills sharp through constant use unless they’re also handling at least some of the regular traffic.

        The trick has been to have the AI do, say, 70% of the job while still having a human step in sometimes. Deciding when the human should step in is the hard problem.
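        In pseudocode the handoff itself is trivial; the hard part is that no real confidence score cleanly captures “this is about to get dangerous”. A sketch, with invented numbers and a hypothetical model.score() self-assessment:

        ```python
        # Sketch of a human-in-the-loop handoff -- not a real ATC system.
        import random

        CONFIDENCE_THRESHOLD = 0.95  # below this, automation must hand off
        HUMAN_PRACTICE_SHARE = 0.30  # routine traffic still given to humans for practice

        def route(request, model):
            """Decide whether automation or a human controller handles a request."""
            if model.score(request) < CONFIDENCE_THRESHOLD:
                return "human"       # the complicated, dangerous 1%
            if random.random() < HUMAN_PRACTICE_SHARE:
                return "human"       # keep the controller's skills sharp
            return "automation"      # the routine majority
        ```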

          • leisesprecher@feddit.org · 3 days ago

            No, it does not.

            A deterministic, narrow algorithm that solves exactly one problem is not an AI. Otherwise the Pythagorean theorem would count as AI, or any other mathematical formula for that matter.

            Intelligence, even in terms of AI, means being able to solve new problems. An autopilot can’t do anything other than pilot a specific aircraft, and that’s a good thing.
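            To spell out just how narrow that kind of “algorithm” is, here is the Pythagorean example in full (a deliberately trivial sketch):

            ```python
            import math

            def hypotenuse(a, b):
                # Deterministic, solves exactly one problem, learns nothing.
                return math.sqrt(a * a + b * b)
            ```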

            • wheezy@lemmy.ml · 3 days ago

              Not sure why you’re getting downvoted. Well, I guess I do know why: AI marketing has ruined the meaning of the word to the point that an if statement counts as “AI”.

              • fmstrat@lemmy.nowsci.com · 2 days ago

                Because they are wrong. An airplane autopilot is not “one model”; it’s a complex set of systems that take actions based on a trained model. The training of that model used standard ML practices. Sure, at base it’s an algorithm, but it follows the same principles. That’s textbook AI.

                No one would have debated this pre-LLM. That being said, if I were in the industry, I’d be calling it an algorithm instead of AI, because those out of the know, well, won’t get it.

            • MrLLM@ani.social · 3 days ago

              > Intelligence, even in terms of AI, means being able to solve new problems.

              I’d argue that an artificial intelligence is a (usually computational) system that can mimic a specific behavior that we consider intelligent, deterministic or not, like playing chess, writing text, piloting an aircraft, etc.

              • leisesprecher@feddit.org · 3 days ago

                And you’d be arguing wrong here; that is simply not the definition of intelligence.

                Extend your logic a bit. Playing an instrument requires intelligence. Is a drum machine intelligent? A mechanical music box?

                Yes, the definition of intelligence is vague, but that doesn’t mean you can extend it indefinitely.

                • MrLLM@ani.social · 3 days ago

                  I wanna point out three things:

                  1. How can you tell someone is wrong when you have no idea?
                  2. I think you missed the point; I said artificial intelligence, not intelligence as a whole.
                  3. Yes, playing an instrument in a way that makes sense requires a certain degree of intelligence. The music box itself is not intelligent, but intelligence was required to build it.
                  • leisesprecher@feddit.org · 3 days ago

                    1. That’s a weak argument without substance. “No, you!” is not exactly a good counter.

                    2. Yes, that’s exactly what I’m talking about, which refutes your argument in 1).

                    3. That’s a whole different discussion. That intelligence is required to build something has nothing to do with whether the product itself is intelligent. The fact that you manage to mangle that up so badly is almost worrying.

                • Pennomi@lemmy.world · 3 days ago

                  I don’t know where you’re getting your definitions, but you are wrong.

                  > Artificial intelligence (AI) is the capability of computational systems to perform tasks typically associated with human intelligence, such as learning, reasoning, problem-solving, perception, and decision-making.

                  For example, the humble A* pathfinding algorithm falls under the domain of AI, despite being a relatively simple and common procedure. Even solving small problems is still considered problem-solving.
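                  And it really is simple; a toy grid version fits in a dozen lines (an illustrative sketch only):

                  ```python
                  # Minimal A* on a 4-connected grid of walkable (x, y) cells.
                  import heapq

                  def a_star(grid, start, goal):
                      """Return the cheapest path length from start to goal, or None."""
                      def h(p):  # Manhattan distance, an admissible heuristic
                          return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

                      open_heap = [(h(start), 0, start)]
                      best_g = {start: 0}
                      while open_heap:
                          _, g, node = heapq.heappop(open_heap)
                          if node == goal:
                              return g
                          for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                              nxt = (node[0] + dx, node[1] + dy)
                              if nxt in grid and g + 1 < best_g.get(nxt, float("inf")):
                                  best_g[nxt] = g + 1
                                  heapq.heappush(open_heap, (g + 1 + h(nxt), g + 1, nxt))
                      return None

                  # a_star({(0, 0), (1, 0), (1, 1)}, (0, 0), (1, 1)) -> 2
                  ```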

                  • leisesprecher@feddit.org · 3 days ago

                    I’m sorry, but that’s the worst possible conclusion you can get from that paragraph.

                    Again, follow your argument through to its end. What would not fall under AI in your world? If A* counts, then literally anything with a simple ‘if’ statement would also count. That’s delusional.

                    Do actually read the article and the articles it links. Are you really, really implying that a simple math equation, one that could be solved by a handful of transistors and capacitors if need be, is doing something “typically associated with human intelligence”? Really?

              • leisesprecher@feddit.org · 3 days ago

                To a certain extent, yes.

                ChatGPT was never explicitly trained to produce code or translate text, but it can do both. Not super well, but it manages reasonable output most of the time.