• AlmightySnoo 🐢🇮🇱🇺🇦@lemmy.world · 8 months ago

    I think the problem is the conflicting goals that Google has with that chip. They want the chip to be able to run AI stuff locally with the Edge TPU ASIC that it includes, but at the same time Google also wants to make money by having Pixel devices offload AI tasks to the cloud. Google can’t reconcile these two goals.

    • Kushan@lemmy.world · 8 months ago

      I don’t think they’re opposing goals. Google doesn’t make more money from a task running in its cloud than from one running on its devices; if anything, the cloud costs them more.

      • AlmightySnoo 🐢🇮🇱🇺🇦@lemmy.world · 8 months ago

        I think it’s realistic to assume that Google will impose quotas on the “free” AI features that currently run in the cloud and charge people for more quota. It makes no economic sense for Google to keep offering that compute for free. Remember Google Colab? It started out completely free with V100 and A100 GPUs; now you have to pay just to keep using a basic T4 GPU without interruption.

        • Kushan@lemmy.world · 8 months ago

          Google makes money from ads that they’re going to serve you no matter where they process your data.

          Google is going to pull all that metadata from your device regardless of where it was processed.

          Servers cost Google money to run; running something on your device costs them nothing. They clearly have a vested interest in running it on your device if they can.

    • nottheengineer@feddit.de · 8 months ago

      There’s a solution: charge the customer once for the hardware, then add a monthly fee to unlock all of it. Sony and Microsoft have had great success with that.

          • 520@kbin.social · 8 months ago

            Lol you wut?

            Do you know how expensive conventional AI setups are? An unlocked AI chip on a phone would quickly replace Nvidia cards in the AI scene for small-scale researchers, especially those dealing with sensitive data for whom cloud access isn’t viable.

            My laptop cost $1500 and is just about viable for this kind of work. It took three days of non-stop running to train a trading model on ~22 stocks, processing 10 years’ worth of data for each.
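            To give a sense of the scale involved: this is a hedged sketch, not the commenter’s actual setup. It mimics the shape of that job with synthetic data standing in for real market feeds (~252 trading days/year × 10 years × 22 tickers) and a tiny NumPy logistic regression predicting next-day direction; the ticker count, window size, and model are all illustrative assumptions.

            ```python
            import numpy as np

            rng = np.random.default_rng(0)
            N_TICKERS, N_DAYS, WINDOW = 22, 2520, 20  # ~10 years of daily data per ticker (assumed)

            # Synthetic log-price random walks, one row per ticker (placeholder for real data)
            prices = np.cumsum(rng.normal(0, 0.01, size=(N_TICKERS, N_DAYS)), axis=1)
            returns = np.diff(prices, axis=1)

            # Features: trailing WINDOW returns; label: whether the next day's return is positive
            X, y = [], []
            for t in range(N_TICKERS):
                for i in range(WINDOW, returns.shape[1] - 1):
                    X.append(returns[t, i - WINDOW:i])
                    y.append(1.0 if returns[t, i] > 0 else 0.0)
            X, y = np.array(X), np.array(y)

            # Plain gradient-descent logistic regression; swapping in richer features and a
            # deeper model is what turns this into a multi-day job on laptop hardware.
            w, b, lr = np.zeros(WINDOW), 0.0, 0.1
            for _ in range(200):
                p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
                grad = p - y
                w -= lr * (X.T @ grad) / len(y)
                b -= lr * grad.mean()

            acc = ((p > 0.5) == (y > 0.5)).mean()
            print(f"{len(y)} samples, train accuracy {acc:.2f}")
            ```

            Even this toy version churns through ~55k training samples; real feature pipelines and bigger models multiply that cost quickly.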

            Now maybe it doesn’t mean much for the average consumer, that’s true. It means a hell of a lot for small-time developers, though, including those building the apps consumers use.