Nvidia Unveils NIM for Seamless Deployment of AI Models in Production

At its GTC conference today, Nvidia announced Nvidia NIM, a new software platform designed to streamline the deployment of customized and pre-trained AI models into production environments. NIM takes the software work Nvidia has done around inferencing and optimizing models and makes it easily accessible by pairing a given model with an optimized inferencing engine and packing the result into a container that can be accessed as a microservice.
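
In practice, that packaging means a developer interacts with a deployed model over a plain HTTP API instead of assembling an inference stack by hand. Below is a minimal, purely illustrative sketch of what calling such a containerized microservice could look like, assuming a NIM-style container running locally and exposing an OpenAI-compatible chat endpoint on port 8000; the endpoint URL and model name are assumptions for illustration, not details confirmed in the announcement.

```python
# Illustrative sketch only: assumes a NIM-style container is already running locally
# and exposes an OpenAI-compatible chat-completions endpoint on port 8000.
# The URL and model identifier below are placeholders, not confirmed details.
import requests

NIM_ENDPOINT = "http://localhost:8000/v1/chat/completions"  # assumed local deployment

payload = {
    "model": "meta/llama3-8b-instruct",  # placeholder model identifier
    "messages": [
        {"role": "user", "content": "Summarize today's GTC announcements in one sentence."}
    ],
    "max_tokens": 128,
}

# Send the request and print the model's reply from the OpenAI-style response shape.
response = requests.post(NIM_ENDPOINT, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```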

Nvidia argues that shipping similar containers would typically take developers weeks, if not months, and that assumes the company has any in-house AI talent at all. With NIM, Nvidia clearly aims to build an ecosystem of AI-ready containers that use its hardware as the foundational layer and these curated microservices as the core software layer for businesses looking to accelerate their AI roadmaps.

Currently, NIM supports open models from NVIDIA as well as Google, Hugging Face, Meta, Microsoft, Mistral AI, Stability AI, AI21, Adept, Cohere, Getty Images, and Shutterstock. Nvidia is already working with Amazon, Google, and Microsoft to make these NIM microservices available on SageMaker, Google Kubernetes Engine, and Azure AI, respectively. They will also be integrated into frameworks such as LangChain, LlamaIndex, and Deepset.

Speaking at a press conference ahead of today’s announcements, Manuvir Das, Nvidia’s head of enterprise computing, said, “We believe that the Nvidia GPU is the best place to run inference of these models on […] and we believe that NVIDIA NIM is the best software package, the best runtime, for developers to build on top of so that they can focus on the enterprise applications — and just let Nvidia do the work to produce these models for them in the most efficient, enterprise-grade manner, so that they can just do the rest of their work.”

Nvidia will use TensorRT, TensorRT-LLM, and the Triton Inference Server as its inference engines. Nvidia microservices available through NIM will include the Earth-2 model for weather and climate simulations, cuOpt for routing optimizations, and Riva for customizing speech and translation models.

The Nvidia RAG LLM operator, for instance, will soon be available as a NIM, a move the company hopes will simplify the process of building generative AI chatbots that can pull in custom data.
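
Nvidia hasn’t detailed how the RAG LLM operator works internally, but the retrieval-augmented generation pattern it packages follows a simple shape: retrieve the pieces of custom data most relevant to a query, then hand them to the model as grounding context. The sketch below illustrates only that generic pattern; the toy keyword retriever and the llm_generate placeholder are assumptions for illustration, not any Nvidia API.

```python
# Generic retrieval-augmented generation (RAG) sketch -- not Nvidia's RAG LLM operator,
# just an illustration of the pattern it is meant to package up. The keyword-overlap
# retriever and llm_generate() stub are hypothetical placeholders.
from typing import List

DOCUMENTS = [
    "Q4 revenue grew 12% year over year, driven by the enterprise segment.",
    "The support backlog was cleared after the new triage workflow launched.",
    "Headcount in the Tokyo office doubled following the 2023 expansion.",
]

def retrieve(question: str, top_k: int = 2) -> List[str]:
    """Rank documents by naive keyword overlap (a stand-in for a real vector store)."""
    q_words = set(question.lower().split())
    scored = sorted(DOCUMENTS, key=lambda d: len(q_words & set(d.lower().split())), reverse=True)
    return scored[:top_k]

def llm_generate(prompt: str) -> str:
    """Placeholder for a call to a hosted model endpoint (e.g., a NIM-style microservice)."""
    return f"[model response grounded in a prompt of {len(prompt)} characters]"

def answer_with_rag(question: str) -> str:
    # 1. Pull in the custom data most relevant to the question.
    context = "\n".join(retrieve(question))
    # 2. Ground the model by placing that context in the prompt.
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    # 3. Generate the answer.
    return llm_generate(prompt)

print(answer_with_rag("How did revenue change in Q4?"))
```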

It wouldn’t be a developer conference without a few customer and partner announcements. NIM’s current users include Box, Cloudera, Cohesity, Datastax, Dropbox, and NetApp.

NVIDIA founder and CEO Jensen Huang said, “Established enterprise platforms are sitting on a goldmine of data that can be transformed into generative AI copilots. These containerized AI microservices, developed with our partner ecosystem, are the building blocks for enterprises in every industry to become AI companies.”

Biden, Kishida Secure Support from Amazon and Nvidia for $50 Million Joint AI Research Program

As the two countries seek to enhance cooperation around the rapidly advancing technology, President Joe Biden and Japanese Prime Minister Fumio Kishida have enlisted Amazon.com Inc. and Nvidia Corp. to fund a new joint artificial intelligence research program.

A senior US official, briefing reporters ahead of Wednesday’s official visit at the White House, said the $50 million project will be a collaboration between the University of Tsukuba, outside Tokyo, and the University of Washington in Seattle. The two nations are also planning a separate joint AI research program between Carnegie Mellon University in Pittsburgh and Tokyo’s Keio University.

The push for greater research into artificial intelligence comes as the Biden administration is weighing a series of new regulations designed to minimize the risks of AI technology, which has developed as a key focus for tech companies. The White House announced late last month that federal agencies have until the end of the year to determine how they will assess, test, and monitor the impact of government use of AI technology.

In addition to the university-led projects, Microsoft Corp. announced on Tuesday that it would invest $2.9 billion to expand its cloud computing and artificial intelligence infrastructure in Japan. Brad Smith, the president of Microsoft, met with Kishida on Tuesday. The company released a statement announcing its intention to establish a new AI and robotics lab in Japan.

Kishida, whose country is the second-largest economy in Asia, on Tuesday urged American business executives to invest more in Japan’s developing technologies.

“Your investments will enable Japan’s economic growth — which will also be capital for more investments from Japan to the US,” Kishida said at a roundtable with business leaders in Washington.

OnePlus and OPPO Collaborate with Google to Introduce Gemini Models for Enhanced Smartphone AI

As anticipated, original equipment manufacturers (OEMs) are integrating AI heavily into their products. Google is working with OnePlus, OPPO, and other companies to bring its Gemini models to their smartphones. As announced at the Google Cloud Next ’24 event, OnePlus and OPPO plan to introduce the Gemini models on their smartphones later in 2024, becoming the first OEMs to do so. The Gemini models are designed to give users an enhanced artificial intelligence (AI) experience on their devices.

Thanks to OnePlus and OPPO’s generative AI models, customers in China can already create AI content on the go with devices like the OnePlus 12 and the OPPO Find X7.

The AI Eraser tool was recently made available to all OnePlus customers worldwide. This AI-powered tool lets users remove unwanted objects from their photos. For OnePlus and OPPO, AI Eraser is only the beginning.

In the future, the companies hope to add more AI-powered features, such as generating original social media content and summarizing news stories and audio.

AI Eraser is powered by AndesGPT, the LLM from OnePlus and OPPO. Even though the Samsung Galaxy S24 and Google Pixel 8 series already offer this feature, it is still encouraging to see OnePlus and OPPO taking the initiative to build AI capabilities into their products.

With the release of the Gemini models, OnePlus and OPPO devices will be able to offer customers a more comprehensive and sophisticated AI experience. It’s worth remembering that OnePlus and OPPO devices already run the Trinity Engine, which makes using the phones remarkably smooth, and already use AI and computational mathematics to enhance mobile photography.

More OEMs are expected to bring AI capabilities to their products in 2024. This is likely to benefit Google, since OEMs will use Gemini as the foundation on which to build their features.

Meta Explores AI-Enabled Search Bar on Instagram

Meta is pushing ahead in its effort to expand the user base for its generative AI-powered products. In addition to testing its Meta AI chatbot with users in countries such as India on WhatsApp, the company is experimenting with placing Meta AI in the Instagram search bar for both AI chat and content discovery.

When you type a query into the search bar, Meta AI initiates a direct message (DM) exchange in which you can ask questions or respond to pre-programmed prompts. Aravind Srinivas, CEO of Perplexity AI, pointed out that the prompt screen’s design is similar to the startup’s search screen.

Plus, it might make it easier for you to find fresh Instagram content. As demonstrated in a user-posted video on Threads, you can search for Reels related to a particular topic by tapping on a prompt such as “Beautiful Maui sunset Reels.”

Additionally, TechCrunch spoke with a few users who were able to ask Meta AI to search for Reels recommendations.

By using generative AI to surface new content from networks like Instagram, Meta hopes to go beyond text generation.

Meta confirmed its Instagram AI experiment to TechCrunch, but the company didn’t say whether or not it uses generative AI technology for search.

A Meta representative told TechCrunch, “We’re testing a range of our generative AI-powered experiences publicly in a limited capacity. They are under development in varying phases.”

There is no shortage of posts discussing the quality of Instagram search, so it’s not surprising that Meta would want to improve search using generative AI.

Meta may also want Instagram content to be as discoverable as TikTok’s. Last year, Google introduced a Perspectives feature to display results from Reddit and TikTok. And according to reverse engineer Alessandro Paluzzi, who spotted the development earlier this week on X, Instagram is working on a feature called “Visibility off Instagram” that could allow posts to appear in search engine results.
