From the (middle of the) story: The reason CES was so packed with random “AI”-branded products was that sticking those two letters to a new company is seen as something of a talisman, a ritual to bring back the (VC) rainy season.
From a consumer perspective, it’s less flashy, I guess. It’s helped me figure out things I can’t find on a search engine, but that’s not quite as big. From an engineer’s perspective, all the tech for Google Maps existed at the time, and for certain users accurate GPS with maps was already an established thing in 2000. On the other hand, we’d been trying to do anything useful with natural language since the 1950s and had thoroughly failed.
From a business perspective, being able to lay off every order taker at your restaurant chain (and maybe the middle managers and bookkeepers too) is huge. It’s obviously huge for order takers, and it’s pretty big for the restaurant owners and anyone who eats at restaurants as well. I think that qualifies as “eating the world”.
That’s really not true. For instance, machine translation and spam detection (document classification) were getting really good by the late 2000s. Image recognition was great beginning in the late 2010s.
What we’ve seen in the last few years (besides continual incremental improvements in already-existing solutions) is improvement in the application of generative tools. So far the use cases of generative models appear to be violating copyright, cheating on homework, and producing even more search engine spam. They can also be somewhat useful as a search engine, so long as you want your answer to be authoritatively worded but don’t care whether it’s true.