At its annual I/O conference, Google unveiled PaLM 2, the successor to its PaLM large language model for understanding and generating multilingual text. Google claims that it's a significant improvement over its predecessor and that it even bests OpenAI's GPT-4, depending on the task at hand.

Absent some hands-on time with PaLM 2, we only have the accompanying Google-authored research paper to go by. But despite some opaqueness where it concerns PaLM 2's technical specs, the paper is forthcoming about many of the model's major limitations.

On the subject of opaqueness, the 91-page paper, published today, doesn't reveal which data exactly was used to train PaLM 2 - save that it was a collection of web documents, books, code, mathematics and conversational data "significantly larger" than that used to train PaLM v1. The co-authors of the paper do claim that the dataset includes a higher percentage of non-English data, but it's unclear where, exactly, this data came from.

The lack of transparency isn't surprising. According to a recent Business Insider report, Google intends to be "more strategic" about the AI research it publishes to "compete and keep knowledge in house," in light of the intensifying competition from Microsoft and OpenAI. OpenAI arguably set the tone with its GPT-4 paper earlier this year, which researchers criticized for withholding key information about the model's makeup. In any case, the change in policy certainly appears to have influenced the PaLM 2 research paper, which in contrast to the paper detailing PaLM doesn't even disclose the exact hardware setup with which PaLM 2 was trained.

Srividya

Remember when we used to have a limit on how many text messages we could send our friends? Most of us tech-savvy Hyderabadis then had a cheat code to send unlimited SMSs through the internet before the era of Whatsapp. Way2SMS was a portal where you could send your message from your mobile number as an SMS to your desired number through the internet without a limit. As Whatsapp took over along with the various chat apps, the service turned irrelevant. The Hyderabadi startup headed by Raju Vanapala still didn't lose hope. In 2016, Raju decided that they had to pivot their business to survive.

"We wanted to put to use the strong userbase we had generated through Way2SMS. From a messaging service we turned to an information portal. The reason why we expanded into vernaculars is that we needed to have a USP in comparison to the other information portals or apps cropping up in the market," says Raju.

Way2News is a hyperlocal short news app based in Hyderabad that provides personalized news in a short, summarized format in eight Indian languages - Hindi, Telugu, Tamil, Malayalam, Marathi, Gujarati, Kannada and Bengali. With the help of the userbase that they already had with Way2SMS, they pivoted to Way2News and are now going strong. In just three years, the company has over 20 million downloads and about 2 million active users. With the use of Artificial Intelligence, users are exposed to relevant and interesting personalized stories from more than 140 categories. It has published 1,40,000-plus short news stories covering 1,10,000 of the 6 lakh plus villages in India, and it flaunts more than 4.5 billion screen views per month. The company currently serves more than 1,000 brands.

He also proudly adds, "Our professional editors work round-the-clock curating credible, informative, time-critical and trusted content for its users. First-hand crowdsourced content catering to the regional audience has been one of the prime focuses. Our news is hyperlocal and is targeted towards the local audience, keeping it more relevant and closer to the info needs of the user."

Ask him what his USP was and Raju says, "The simplicity of our design and usability is what is keeping our users hooked. It works with just a swipe and doesn't need one to understand too many functions."